Dec 11 05:14:56 crc systemd[1]: Starting Kubernetes Kubelet... Dec 11 05:14:56 crc restorecon[4615]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Dec 11 05:14:56 
crc restorecon[4615]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 11 
05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc 
restorecon[4615]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:56 
crc restorecon[4615]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 
crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:56 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 
05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 11 05:14:57 crc 
restorecon[4615]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 
05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 
05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc 
restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 05:14:57 crc restorecon[4615]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 11 05:14:57 crc restorecon[4615]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 11 05:14:57 crc kubenswrapper[4628]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 11 05:14:57 crc kubenswrapper[4628]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 11 05:14:57 crc kubenswrapper[4628]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 11 05:14:57 crc kubenswrapper[4628]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 11 05:14:57 crc kubenswrapper[4628]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 11 05:14:57 crc kubenswrapper[4628]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.724675 4628 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728068 4628 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728086 4628 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728092 4628 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728099 4628 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728105 4628 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728110 4628 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728116 4628 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728121 4628 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728127 4628 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728133 4628 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728140 4628 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728146 4628 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728150 4628 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728156 4628 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728161 4628 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728166 4628 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728171 4628 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728175 4628 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728180 4628 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728186 4628 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728207 4628 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728213 4628 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728219 4628 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728225 4628 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728231 4628 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728236 4628 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728241 4628 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728245 4628 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728250 4628 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728255 4628 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728260 4628 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728264 4628 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728269 4628 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728274 4628 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728279 4628 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728284 4628 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728289 4628 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728294 4628 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728298 4628 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728303 4628 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728308 4628 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728313 4628 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728319 4628 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728324 4628 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728329 4628 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 11 05:14:57 crc 
kubenswrapper[4628]: W1211 05:14:57.728334 4628 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728338 4628 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728343 4628 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728348 4628 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728352 4628 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728357 4628 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728362 4628 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728366 4628 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728371 4628 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728376 4628 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728381 4628 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728387 4628 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728392 4628 feature_gate.go:330] unrecognized feature gate: Example Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728396 4628 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728401 4628 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728406 4628 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728411 4628 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728415 4628 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728422 4628 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728429 4628 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728434 4628 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728440 4628 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728445 4628 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728450 4628 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728454 4628 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.728459 4628 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.728759 4628 flags.go:64] FLAG: --address="0.0.0.0" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.728772 4628 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.728924 4628 flags.go:64] FLAG: --anonymous-auth="true" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.728933 4628 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.728940 4628 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.728946 4628 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.728954 4628 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.728960 4628 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.728966 4628 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.728972 4628 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.728978 4628 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.728984 4628 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.728990 4628 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.728995 4628 flags.go:64] FLAG: --cgroup-root="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729001 4628 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729006 4628 flags.go:64] FLAG: --client-ca-file="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729012 4628 flags.go:64] FLAG: --cloud-config="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729017 4628 flags.go:64] FLAG: --cloud-provider="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729022 4628 flags.go:64] FLAG: --cluster-dns="[]" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729031 4628 flags.go:64] FLAG: --cluster-domain="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729036 4628 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729043 4628 flags.go:64] FLAG: --config-dir="" Dec 11 05:14:57 crc 
kubenswrapper[4628]: I1211 05:14:57.729048 4628 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729054 4628 flags.go:64] FLAG: --container-log-max-files="5" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729061 4628 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729067 4628 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729073 4628 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729079 4628 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729084 4628 flags.go:64] FLAG: --contention-profiling="false" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729090 4628 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729095 4628 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729101 4628 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729106 4628 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729113 4628 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729119 4628 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729125 4628 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729130 4628 flags.go:64] FLAG: --enable-load-reader="false" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729136 4628 flags.go:64] FLAG: --enable-server="true" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729141 4628 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729148 4628 flags.go:64] FLAG: --event-burst="100" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729153 4628 flags.go:64] FLAG: --event-qps="50" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729159 4628 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729164 4628 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729170 4628 flags.go:64] FLAG: --eviction-hard="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729177 4628 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729182 4628 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729187 4628 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729193 4628 flags.go:64] FLAG: --eviction-soft="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729199 4628 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729204 4628 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729210 4628 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729215 4628 
flags.go:64] FLAG: --experimental-mounter-path="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729220 4628 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729226 4628 flags.go:64] FLAG: --fail-swap-on="true" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729231 4628 flags.go:64] FLAG: --feature-gates="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729238 4628 flags.go:64] FLAG: --file-check-frequency="20s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729243 4628 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729250 4628 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729256 4628 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729261 4628 flags.go:64] FLAG: --healthz-port="10248" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729267 4628 flags.go:64] FLAG: --help="false" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729272 4628 flags.go:64] FLAG: --hostname-override="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729278 4628 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729283 4628 flags.go:64] FLAG: --http-check-frequency="20s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729289 4628 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729294 4628 flags.go:64] FLAG: --image-credential-provider-config="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729301 4628 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729307 4628 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729312 4628 flags.go:64] FLAG: --image-service-endpoint="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729318 4628 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729323 4628 flags.go:64] FLAG: --kube-api-burst="100" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729329 4628 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729336 4628 flags.go:64] FLAG: --kube-api-qps="50" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729341 4628 flags.go:64] FLAG: --kube-reserved="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729347 4628 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729352 4628 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729358 4628 flags.go:64] FLAG: --kubelet-cgroups="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729363 4628 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729369 4628 flags.go:64] FLAG: --lock-file="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729374 4628 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729379 4628 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729385 4628 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 11 
05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729393 4628 flags.go:64] FLAG: --log-json-split-stream="false" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729399 4628 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729404 4628 flags.go:64] FLAG: --log-text-split-stream="false" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729410 4628 flags.go:64] FLAG: --logging-format="text" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729415 4628 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729421 4628 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729426 4628 flags.go:64] FLAG: --manifest-url="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729432 4628 flags.go:64] FLAG: --manifest-url-header="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729439 4628 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729444 4628 flags.go:64] FLAG: --max-open-files="1000000" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729451 4628 flags.go:64] FLAG: --max-pods="110" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729458 4628 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729464 4628 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729469 4628 flags.go:64] FLAG: --memory-manager-policy="None" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729475 4628 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729480 4628 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729488 4628 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729494 4628 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729506 4628 flags.go:64] FLAG: --node-status-max-images="50" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729511 4628 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729517 4628 flags.go:64] FLAG: --oom-score-adj="-999" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729522 4628 flags.go:64] FLAG: --pod-cidr="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729528 4628 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729537 4628 flags.go:64] FLAG: --pod-manifest-path="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729542 4628 flags.go:64] FLAG: --pod-max-pids="-1" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729548 4628 flags.go:64] FLAG: --pods-per-core="0" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729554 4628 flags.go:64] FLAG: --port="10250" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729560 4628 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729565 4628 flags.go:64] FLAG: 
--provider-id="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729570 4628 flags.go:64] FLAG: --qos-reserved="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729576 4628 flags.go:64] FLAG: --read-only-port="10255" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729581 4628 flags.go:64] FLAG: --register-node="true" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729587 4628 flags.go:64] FLAG: --register-schedulable="true" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729592 4628 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729601 4628 flags.go:64] FLAG: --registry-burst="10" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729606 4628 flags.go:64] FLAG: --registry-qps="5" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729612 4628 flags.go:64] FLAG: --reserved-cpus="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729617 4628 flags.go:64] FLAG: --reserved-memory="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729625 4628 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729632 4628 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729638 4628 flags.go:64] FLAG: --rotate-certificates="false" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729644 4628 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729651 4628 flags.go:64] FLAG: --runonce="false" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729656 4628 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729662 4628 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729668 4628 flags.go:64] FLAG: --seccomp-default="false" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729673 4628 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729680 4628 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729686 4628 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729692 4628 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729698 4628 flags.go:64] FLAG: --storage-driver-password="root" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729703 4628 flags.go:64] FLAG: --storage-driver-secure="false" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729708 4628 flags.go:64] FLAG: --storage-driver-table="stats" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729714 4628 flags.go:64] FLAG: --storage-driver-user="root" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729719 4628 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729725 4628 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729731 4628 flags.go:64] FLAG: --system-cgroups="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729736 4628 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729744 4628 flags.go:64] FLAG: 
--system-reserved-cgroup="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729750 4628 flags.go:64] FLAG: --tls-cert-file="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729755 4628 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729762 4628 flags.go:64] FLAG: --tls-min-version="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729767 4628 flags.go:64] FLAG: --tls-private-key-file="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729772 4628 flags.go:64] FLAG: --topology-manager-policy="none" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729777 4628 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729783 4628 flags.go:64] FLAG: --topology-manager-scope="container" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729789 4628 flags.go:64] FLAG: --v="2" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729796 4628 flags.go:64] FLAG: --version="false" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729803 4628 flags.go:64] FLAG: --vmodule="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729809 4628 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.729816 4628 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.729969 4628 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.729977 4628 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.729984 4628 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
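The flags.go:64 lines above dump every kubelet flag together with its effective value, for example FLAG: --node-ip="192.168.126.11". A small sketch, under the same assumed filename and with a hypothetical collect_flags helper, that turns that dump into a name-to-value mapping for easier inspection:

import re

# Matches 'flags.go:64] FLAG: --name="value"' entries from the dump above.
FLAG_LINE = re.compile(r'flags\.go:64\] FLAG: (--[\w-]+)="(.*?)"')

def collect_flags(path="kubelet.log"):
    """Map each kubelet flag from the flags.go:64 dump to its quoted value."""
    flags = {}
    with open(path, encoding="utf-8") as fh:
        for name, value in FLAG_LINE.findall(fh.read()):
            flags[name] = value
    return flags

if __name__ == "__main__":
    flags = collect_flags()
    print(flags.get("--config"))           # /etc/kubernetes/kubelet.conf
    print(flags.get("--node-ip"))          # 192.168.126.11
    print(flags.get("--system-reserved"))  # cpu=200m,ephemeral-storage=350Mi,memory=350Mi

The example values in the comments are the ones visible in the dump above.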
Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.729991 4628 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.729996 4628 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730002 4628 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730007 4628 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730012 4628 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730018 4628 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730023 4628 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730029 4628 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730033 4628 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730039 4628 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730044 4628 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730049 4628 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730053 4628 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730058 4628 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730063 4628 feature_gate.go:330] unrecognized feature gate: Example Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730068 4628 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730073 4628 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730077 4628 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730082 4628 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730087 4628 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730092 4628 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730097 4628 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730101 4628 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730106 4628 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730111 4628 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730116 4628 feature_gate.go:330] unrecognized 
feature gate: AutomatedEtcdBackup Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730121 4628 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730125 4628 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730130 4628 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730135 4628 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730140 4628 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730147 4628 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730152 4628 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730157 4628 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730162 4628 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730167 4628 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730173 4628 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730178 4628 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730183 4628 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730188 4628 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730192 4628 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730197 4628 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730202 4628 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730207 4628 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730211 4628 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730217 4628 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730222 4628 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730227 4628 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730233 4628 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730239 4628 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730245 4628 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730252 4628 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730257 4628 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730262 4628 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730267 4628 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730273 4628 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730278 4628 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730283 4628 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730289 4628 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730294 4628 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730299 4628 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730304 4628 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730310 4628 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730315 4628 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730320 4628 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730325 4628 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730331 4628 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.730335 4628 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.730487 4628 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.740233 4628 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.740295 4628 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740416 4628 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallAzure Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740427 4628 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740434 4628 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740439 4628 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740447 4628 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740453 4628 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740555 4628 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740562 4628 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740568 4628 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740574 4628 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740579 4628 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740584 4628 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740590 4628 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740625 4628 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740631 4628 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740638 4628 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740645 4628 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740651 4628 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740661 4628 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740671 4628 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740681 4628 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740693 4628 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740700 4628 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740707 4628 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740713 4628 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740721 4628 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740727 4628 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740738 4628 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740744 4628 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740751 4628 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740757 4628 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740766 4628 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740774 4628 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740779 4628 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740785 4628 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740792 4628 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740798 4628 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740805 4628 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740815 4628 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740829 4628 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740836 4628 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740881 4628 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740891 4628 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740898 4628 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740905 4628 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740912 4628 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740922 4628 feature_gate.go:330] unrecognized feature gate: 
MultiArchInstallGCP Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740928 4628 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740936 4628 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740946 4628 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740955 4628 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740963 4628 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740970 4628 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740976 4628 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740983 4628 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740988 4628 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.740994 4628 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741001 4628 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741007 4628 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741014 4628 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741020 4628 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741028 4628 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741037 4628 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741044 4628 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741054 4628 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
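Most of the W-level noise in this excerpt is the upstream kubelet rejecting OpenShift-specific feature gates it does not recognize (feature_gate.go:330 "unrecognized feature gate: ..."), and the same set is reported again each time the gates are re-parsed. A hedged sketch, again assuming the excerpt is saved locally, that de-duplicates and counts those warnings:

import re
from collections import Counter

# Matches the gate name in 'unrecognized feature gate: <Name>' warnings.
UNRECOGNIZED = re.compile(r"unrecognized feature gate: (\w+)")

def count_unrecognized_gates(path="kubelet.log"):
    """Count how often each unrecognized feature gate is reported in the excerpt."""
    with open(path, encoding="utf-8") as fh:
        return Counter(UNRECOGNIZED.findall(fh.read()))

if __name__ == "__main__":
    counts = count_unrecognized_gates()
    print(f"{len(counts)} distinct unrecognized gates")
    for gate, n in counts.most_common(5):
        print(f"{gate}: {n}")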
Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741063 4628 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741070 4628 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741078 4628 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741084 4628 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741091 4628 feature_gate.go:330] unrecognized feature gate: Example Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741097 4628 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.741110 4628 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741314 4628 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741327 4628 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741334 4628 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741340 4628 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741345 4628 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741352 4628 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741357 4628 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741363 4628 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741369 4628 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741374 4628 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741382 4628 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741391 4628 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741398 4628 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741407 4628 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741414 4628 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741421 4628 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741428 4628 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741435 4628 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741442 4628 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741448 4628 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741454 4628 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741459 4628 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741466 4628 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741472 4628 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741478 4628 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741483 4628 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741489 4628 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741495 4628 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741501 4628 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741506 4628 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741514 4628 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741521 4628 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741527 4628 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741533 4628 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741539 4628 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741545 4628 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741551 4628 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741558 4628 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741563 4628 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741569 4628 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741574 4628 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741580 4628 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741585 4628 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741591 4628 feature_gate.go:330] unrecognized feature gate: Example Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741596 4628 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741601 4628 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741607 4628 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741612 4628 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741618 4628 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741623 4628 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741629 4628 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741634 4628 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741640 4628 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741647 4628 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741657 4628 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741664 4628 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741671 4628 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741680 4628 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741687 4628 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741693 4628 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741699 4628 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741706 4628 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741712 4628 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741717 4628 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741723 4628 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741728 4628 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741733 4628 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741740 4628 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741745 4628 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741750 4628 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.741756 4628 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.741766 4628 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.742394 4628 server.go:940] "Client rotation is on, will bootstrap in background" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.746640 4628 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.746776 4628 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
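After discarding the unknown gates, the kubelet logs its effective gate set three times in the identical feature_gate.go:386 form "feature gates: {map[Name:true Name:false ...]}". A sketch for parsing one of those summary lines into a Python dict of booleans; the parse_feature_gates name and the inlined sample string are illustrative assumptions, with the sample pairs taken from the summaries above:

import re

# Captures the space-separated Name:bool pairs inside 'feature gates: {map[...]}'.
GATE_MAP = re.compile(r"feature gates: \{map\[(.*?)\]\}")

def parse_feature_gates(line):
    """Parse a 'feature gates: {map[Name:bool ...]}' summary into a dict of booleans."""
    match = GATE_MAP.search(line)
    if not match:
        return {}
    gates = {}
    for pair in match.group(1).split():
        name, _, value = pair.partition(":")
        gates[name] = value == "true"
    return gates

if __name__ == "__main__":
    summary = ("feature gates: {map[CloudDualStackNodeIPs:true "
               "DisableKubeletCloudCredentialProviders:true KMSv1:true "
               "NodeSwap:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}")
    print(parse_feature_gates(summary))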
Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.747416 4628 server.go:997] "Starting client certificate rotation" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.747440 4628 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.747643 4628 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-23 15:17:13.386495839 +0000 UTC Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.747726 4628 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.753329 4628 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 11 05:14:57 crc kubenswrapper[4628]: E1211 05:14:57.754806 4628 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.18:6443: connect: connection refused" logger="UnhandledError" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.755454 4628 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.763391 4628 log.go:25] "Validated CRI v1 runtime API" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.780492 4628 log.go:25] "Validated CRI v1 image API" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.782531 4628 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.785198 4628 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-11-05-08-35-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.785229 4628 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.802065 4628 manager.go:217] Machine: {Timestamp:2025-12-11 05:14:57.800191538 +0000 UTC m=+0.217538276 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2799998 MemoryCapacity:25199480832 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4d099854-d5b4-4b96-bdd8-2b0cb26202c0 BootID:e0c3d96d-e16a-42ff-9796-87bc8c4f15f0 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 
Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:a6:5f:25 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:a6:5f:25 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:51:af:dd Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:6e:ef:8f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:ed:2d:b8 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:a2:d6:f7 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:f6:8e:52:2f:e1:1a Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fe:4a:95:07:26:88 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199480832 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.802396 4628 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.802678 4628 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.803569 4628 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.803959 4628 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.804031 4628 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.804583 4628 topology_manager.go:138] "Creating topology manager with none policy" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.804608 4628 container_manager_linux.go:303] "Creating device plugin manager" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.804920 4628 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.804980 4628 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.805312 4628 state_mem.go:36] "Initialized new in-memory state store" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.805464 4628 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.806420 4628 kubelet.go:418] "Attempting to sync node with API server" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.806459 4628 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
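The nodeConfig object dumped at container_manager_linux.go:272 is plain JSON, so the reserved-resource and hard-eviction settings can be pulled straight out of a captured log line. A rough sketch, assuming the line has already been trimmed down to the {...} payload; the struct and field names below are ad hoc, chosen only to mirror the keys visible above:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Ad hoc types covering just the two fields of interest from the logged nodeConfig.
type nodeConfig struct {
	SystemReserved         map[string]string `json:"SystemReserved"`
	HardEvictionThresholds []struct {
		Signal   string `json:"Signal"`
		Operator string `json:"Operator"`
		Value    struct {
			Quantity   *string `json:"Quantity"` // nil when the threshold is percentage-based
			Percentage float64 `json:"Percentage"`
		} `json:"Value"`
	} `json:"HardEvictionThresholds"`
}

func main() {
	// A shortened copy of the payload from the log entry above.
	raw := `{"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},` +
		`"HardEvictionThresholds":[` +
		`{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0}},` +
		`{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1}}]}`

	var cfg nodeConfig
	if err := json.Unmarshal([]byte(raw), &cfg); err != nil {
		panic(err)
	}

	fmt.Println("system reserved:", cfg.SystemReserved)
	for _, t := range cfg.HardEvictionThresholds {
		if t.Value.Quantity != nil {
			fmt.Printf("evict when %s %s %s\n", t.Signal, t.Operator, *t.Value.Quantity)
		} else {
			fmt.Printf("evict when %s %s %.0f%%\n", t.Signal, t.Operator, t.Value.Percentage*100)
		}
	}
}
```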
Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.806502 4628 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.806525 4628 kubelet.go:324] "Adding apiserver pod source" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.806549 4628 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.808566 4628 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.808602 4628 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 05:14:57 crc kubenswrapper[4628]: E1211 05:14:57.808733 4628 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.18:6443: connect: connection refused" logger="UnhandledError" Dec 11 05:14:57 crc kubenswrapper[4628]: E1211 05:14:57.808741 4628 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.18:6443: connect: connection refused" logger="UnhandledError" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.809495 4628 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.809880 4628 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
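Every failure in this stretch is the same symptom: TCP connections to api-int.crc.testing:6443 (38.102.83.18) are refused because the API server is not yet serving. A tiny stand-alone probe of that endpoint, handy for checking whether the condition persists; only the host and port are taken from the errors above, everything else is illustrative:

```go
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Endpoint taken from the dial errors in the log above.
	addr := "api-int.crc.testing:6443"

	conn, err := net.DialTimeout("tcp", addr, 3*time.Second)
	if err != nil {
		// Expected while the API server is still down, e.g. "connect: connection refused".
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()
	fmt.Println("API endpoint reachable:", conn.RemoteAddr())
}
```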
Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.810569 4628 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.811157 4628 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.811181 4628 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.811190 4628 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.811199 4628 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.811210 4628 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.811218 4628 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.811225 4628 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.811235 4628 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.811244 4628 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.811251 4628 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.811265 4628 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.811273 4628 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.811442 4628 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.812004 4628 server.go:1280] "Started kubelet" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.812322 4628 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.812370 4628 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.812842 4628 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 05:14:57 crc systemd[1]: Started Kubernetes Kubelet. 
Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.814172 4628 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.815116 4628 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.815167 4628 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.815302 4628 server.go:460] "Adding debug handlers to kubelet server" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.815196 4628 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 05:39:23.207827003 +0000 UTC Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.815339 4628 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 384h24m25.392494065s for next certificate rotation Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.815776 4628 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.815798 4628 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 11 05:14:57 crc kubenswrapper[4628]: E1211 05:14:57.815935 4628 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.816049 4628 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 11 05:14:57 crc kubenswrapper[4628]: E1211 05:14:57.816590 4628 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="200ms" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.817088 4628 factory.go:55] Registering systemd factory Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.817121 4628 factory.go:221] Registration of the systemd container factory successfully Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.817288 4628 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 05:14:57 crc kubenswrapper[4628]: E1211 05:14:57.817381 4628 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.18:6443: connect: connection refused" logger="UnhandledError" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.817666 4628 factory.go:153] Registering CRI-O factory Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.817720 4628 factory.go:221] Registration of the crio container factory successfully Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.817912 4628 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.818414 4628 factory.go:103] 
Registering Raw factory Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.818614 4628 manager.go:1196] Started watching for new ooms in manager Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.819995 4628 manager.go:319] Starting recovery of all containers Dec 11 05:14:57 crc kubenswrapper[4628]: E1211 05:14:57.823749 4628 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.18:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18801151ffdd8ceb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-11 05:14:57.811967211 +0000 UTC m=+0.229313909,LastTimestamp:2025-12-11 05:14:57.811967211 +0000 UTC m=+0.229313909,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846103 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846202 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846222 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846245 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846260 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846274 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846290 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846305 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846321 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846335 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846351 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846367 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846382 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846400 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846414 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846433 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846450 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846468 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846483 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846497 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846514 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846529 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846544 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846562 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846579 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846631 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846650 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846666 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846707 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846723 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846738 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846754 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846769 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846814 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846874 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846890 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846905 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846921 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846935 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846949 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846962 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.846975 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847208 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847231 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847245 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847259 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847274 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847292 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847311 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847340 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847356 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847371 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847392 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847409 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847423 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847438 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847453 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847480 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847494 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847508 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847523 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847537 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847553 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847570 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847585 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847637 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847659 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847734 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847751 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847768 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847782 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847796 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847812 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847826 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847840 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847961 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847978 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.847993 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848007 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848022 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848037 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848054 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848068 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848086 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848100 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848117 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848130 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848143 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848158 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848174 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848189 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848204 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848221 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848237 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848254 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848269 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848285 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848303 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848318 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848333 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848350 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848368 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848384 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848400 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848430 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848447 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848464 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848479 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848493 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848509 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848526 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848540 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848557 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848572 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848589 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848608 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848627 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848644 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848663 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848679 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848694 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848733 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848748 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848762 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848775 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848789 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848803 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848817 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848832 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848867 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848882 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848897 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848911 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848926 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848944 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848959 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848975 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.848990 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849004 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849021 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849035 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849049 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849063 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849078 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849092 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849108 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849124 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849139 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849152 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849166 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849179 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849193 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849207 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849223 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849239 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849255 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849270 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849285 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849300 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849313 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849329 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849642 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849661 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849681 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849700 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849720 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849741 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849761 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849777 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849798 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849816 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849835 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849879 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849899 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849912 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849927 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849974 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.849994 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.850012 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.850031 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.850050 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.850069 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.850089 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.850975 4628 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851009 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851026 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851044 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851059 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851074 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851089 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851104 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851120 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851136 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851154 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851170 4628 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851186 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851200 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851214 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851230 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851245 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851261 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851281 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851296 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851310 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851327 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851341 4628 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851354 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851368 4628 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851382 4628 reconstruct.go:97] "Volume reconstruction finished" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.851430 4628 reconciler.go:26] "Reconciler: start to sync state" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.864813 4628 manager.go:324] Recovery completed Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.875222 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.877404 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.877446 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.877455 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.878663 4628 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.878679 4628 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.878700 4628 state_mem.go:36] "Initialized new in-memory state store" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.886155 4628 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.888122 4628 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.888161 4628 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.888190 4628 kubelet.go:2335] "Starting kubelet main sync loop" Dec 11 05:14:57 crc kubenswrapper[4628]: E1211 05:14:57.888233 4628 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 11 05:14:57 crc kubenswrapper[4628]: W1211 05:14:57.888917 4628 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 05:14:57 crc kubenswrapper[4628]: E1211 05:14:57.888966 4628 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.18:6443: connect: connection refused" logger="UnhandledError" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.907265 4628 policy_none.go:49] "None policy: Start" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.908396 4628 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.908427 4628 state_mem.go:35] "Initializing new in-memory state store" Dec 11 05:14:57 crc kubenswrapper[4628]: E1211 05:14:57.917279 4628 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.950958 4628 manager.go:334] "Starting Device Plugin manager" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.951016 4628 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.951029 4628 server.go:79] "Starting device plugin registration server" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.951547 4628 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.951567 4628 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.951863 4628 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.951941 4628 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.951952 4628 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 11 05:14:57 crc kubenswrapper[4628]: E1211 05:14:57.962794 4628 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.989251 4628 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 11 05:14:57 crc kubenswrapper[4628]: 
I1211 05:14:57.989352 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.991306 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.991378 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.991392 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.991719 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.992403 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.993043 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.993702 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.993735 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.993746 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.993883 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.993970 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.994010 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.994196 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.994234 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.994245 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.995382 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.995428 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.995443 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.995636 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.995869 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.995944 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.996331 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.996371 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.996385 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.997083 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.997160 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.997172 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.997557 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.997579 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.997587 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.997723 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.997887 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.997944 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.998672 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.998719 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.998738 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.999014 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.999060 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.999073 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.999091 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:57 crc kubenswrapper[4628]: I1211 05:14:57.999100 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.000091 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.000120 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.000133 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:58 crc kubenswrapper[4628]: E1211 05:14:58.017454 4628 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="400ms" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.052057 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.053255 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.053299 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.053319 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.053350 4628 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 11 05:14:58 crc kubenswrapper[4628]: E1211 05:14:58.054183 4628 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.18:6443: connect: connection refused" node="crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.054223 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.054563 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 
05:14:58.054725 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.054865 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.054991 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.055102 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.055240 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.055365 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.055485 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.055599 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.055706 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.055863 4628 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.055978 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.056092 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.056218 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.157302 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.157352 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.157371 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.157389 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.157403 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.157420 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.157439 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.157455 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.157473 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.157489 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.157507 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.157531 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.157547 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.157562 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.157577 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.157998 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") 
pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.158042 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.158067 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.158085 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.158091 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.158119 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.158104 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.158140 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.158162 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.158169 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.158187 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.158219 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.158226 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.158242 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.158279 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.255467 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.257165 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.257228 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.257247 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.257282 4628 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 11 05:14:58 crc kubenswrapper[4628]: E1211 05:14:58.258340 4628 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.18:6443: connect: connection refused" node="crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.331545 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.340439 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: W1211 05:14:58.360965 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-04d0f457bf2321a493f2f5cad2b8a05fc7649583e414171876fafe5b45009bd7 WatchSource:0}: Error finding container 04d0f457bf2321a493f2f5cad2b8a05fc7649583e414171876fafe5b45009bd7: Status 404 returned error can't find the container with id 04d0f457bf2321a493f2f5cad2b8a05fc7649583e414171876fafe5b45009bd7 Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.362368 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: W1211 05:14:58.366739 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-e670eb7cd360179b7e9000b382adb7a66fe5a4d8db0560661508d420a35a9cf1 WatchSource:0}: Error finding container e670eb7cd360179b7e9000b382adb7a66fe5a4d8db0560661508d420a35a9cf1: Status 404 returned error can't find the container with id e670eb7cd360179b7e9000b382adb7a66fe5a4d8db0560661508d420a35a9cf1 Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.382674 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.391805 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:14:58 crc kubenswrapper[4628]: E1211 05:14:58.418692 4628 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="800ms" Dec 11 05:14:58 crc kubenswrapper[4628]: W1211 05:14:58.542975 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-4b228939359173de3f575fed4f17405942462748b1829a9cfa6ed7bf3c08cc62 WatchSource:0}: Error finding container 4b228939359173de3f575fed4f17405942462748b1829a9cfa6ed7bf3c08cc62: Status 404 returned error can't find the container with id 4b228939359173de3f575fed4f17405942462748b1829a9cfa6ed7bf3c08cc62 Dec 11 05:14:58 crc kubenswrapper[4628]: W1211 05:14:58.563081 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-eb5fc82a0b7fd0d77891177923929f4477450b771223f37faf14a0286c3f51f8 WatchSource:0}: Error finding container eb5fc82a0b7fd0d77891177923929f4477450b771223f37faf14a0286c3f51f8: Status 404 returned error can't find the container with id eb5fc82a0b7fd0d77891177923929f4477450b771223f37faf14a0286c3f51f8 Dec 11 05:14:58 crc kubenswrapper[4628]: W1211 05:14:58.566218 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-7942b0f06d1bd3757c52b287e932ecd9503ceafcfdaa3272ce2fb8355b5d6504 WatchSource:0}: Error finding container 7942b0f06d1bd3757c52b287e932ecd9503ceafcfdaa3272ce2fb8355b5d6504: Status 404 
returned error can't find the container with id 7942b0f06d1bd3757c52b287e932ecd9503ceafcfdaa3272ce2fb8355b5d6504 Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.658607 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.660218 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.660266 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.660278 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.660310 4628 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 11 05:14:58 crc kubenswrapper[4628]: E1211 05:14:58.660818 4628 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.18:6443: connect: connection refused" node="crc" Dec 11 05:14:58 crc kubenswrapper[4628]: W1211 05:14:58.784338 4628 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 05:14:58 crc kubenswrapper[4628]: E1211 05:14:58.784448 4628 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.18:6443: connect: connection refused" logger="UnhandledError" Dec 11 05:14:58 crc kubenswrapper[4628]: W1211 05:14:58.807434 4628 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 05:14:58 crc kubenswrapper[4628]: E1211 05:14:58.807522 4628 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.18:6443: connect: connection refused" logger="UnhandledError" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.814171 4628 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.894916 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7942b0f06d1bd3757c52b287e932ecd9503ceafcfdaa3272ce2fb8355b5d6504"} Dec 11 05:14:58 crc kubenswrapper[4628]: W1211 05:14:58.905120 4628 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": 
dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 05:14:58 crc kubenswrapper[4628]: E1211 05:14:58.905192 4628 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.18:6443: connect: connection refused" logger="UnhandledError" Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.933379 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"eb5fc82a0b7fd0d77891177923929f4477450b771223f37faf14a0286c3f51f8"} Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.935691 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4b228939359173de3f575fed4f17405942462748b1829a9cfa6ed7bf3c08cc62"} Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.936934 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e670eb7cd360179b7e9000b382adb7a66fe5a4d8db0560661508d420a35a9cf1"} Dec 11 05:14:58 crc kubenswrapper[4628]: I1211 05:14:58.937910 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"04d0f457bf2321a493f2f5cad2b8a05fc7649583e414171876fafe5b45009bd7"} Dec 11 05:14:59 crc kubenswrapper[4628]: E1211 05:14:59.219941 4628 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="1.6s" Dec 11 05:14:59 crc kubenswrapper[4628]: W1211 05:14:59.415449 4628 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 05:14:59 crc kubenswrapper[4628]: E1211 05:14:59.415597 4628 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.18:6443: connect: connection refused" logger="UnhandledError" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.461587 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.463379 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.463452 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.463475 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.463524 4628 
kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 11 05:14:59 crc kubenswrapper[4628]: E1211 05:14:59.464215 4628 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.18:6443: connect: connection refused" node="crc" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.813598 4628 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.18:6443: connect: connection refused Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.841894 4628 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 11 05:14:59 crc kubenswrapper[4628]: E1211 05:14:59.842834 4628 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.18:6443: connect: connection refused" logger="UnhandledError" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.942254 4628 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="26b94038c88bf91de8f2a022801caae70c87dc2f900249951e69c2cf4879e156" exitCode=0 Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.942363 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"26b94038c88bf91de8f2a022801caae70c87dc2f900249951e69c2cf4879e156"} Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.942380 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.943471 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.943507 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.943522 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.944947 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"48ffe9d7b0556b96825e3e58a0b7ccd2a5ae51ed64f3ad06762a745c75b779de"} Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.944973 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5f8590a37fc6bbfe17efbfb9d1d02d0d8210ce54303f2e691b5adafb771bed9b"} Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.944983 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c1e544f479d3ab726765f5a5b361070e9f87062533a676c46064b447c9469eb5"} Dec 11 05:14:59 crc kubenswrapper[4628]: 
I1211 05:14:59.946530 4628 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b" exitCode=0 Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.946592 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b"} Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.946697 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.947789 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.947815 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.947825 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.948244 4628 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c014c5e16aead86a9f796eb55aacf053ff5cd67b7686c7c9d7c3f94441ce925f" exitCode=0 Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.948305 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c014c5e16aead86a9f796eb55aacf053ff5cd67b7686c7c9d7c3f94441ce925f"} Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.948403 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.949058 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.949610 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.949642 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.949645 4628 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="c7c5e5171be53abafdcf277bb56ce5d48b2ba27f4a778eac83cacabc6af62957" exitCode=0 Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.949658 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.949670 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"c7c5e5171be53abafdcf277bb56ce5d48b2ba27f4a778eac83cacabc6af62957"} Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.951543 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.951543 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:59 crc kubenswrapper[4628]: 
I1211 05:14:59.952080 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.952096 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.953530 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.953559 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:14:59 crc kubenswrapper[4628]: I1211 05:14:59.953569 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.956374 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a04405d2b3f177dfbc4b04e9d823a69508574940e43547466dfc75a421f65789"} Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.956454 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.957347 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.957371 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.957380 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.960037 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"32d87588e039f5452312557720f5985a726a905bb51912c2c7b35ecee3858453"} Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.960067 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b32fc5b6abbb405e45900e4faa4990cca046cd21b5f284b0e6903388ec44fbd6"} Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.960077 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f6be3ef18bdf9e850ba13649f4bd7aa9fe150f3791ed3e7d8ccd5d8439fbd76a"} Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.960086 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"62618b6f7436c23be40f65807a4b596cc5239cbc0a3bcb56392a432931cee1e0"} Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.961475 4628 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5e59dc229bea702e93e5c4a5eea985fc1b5edc0868f2d427f9402d9dec75d7d3" exitCode=0 Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.961522 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5e59dc229bea702e93e5c4a5eea985fc1b5edc0868f2d427f9402d9dec75d7d3"} Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.961606 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.962289 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.962307 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.962317 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.966740 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.966736 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c3156dc7d63066f74a73b1bb28a11c25f3f0d82120ae6e76440280b28de5cdd3"} Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.967421 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.968314 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.968331 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.973324 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e9634651a9b38fbe999b60d48d6a5d5e7921bb7da956d3ab8c0e61bab72f7578"} Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.973358 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4879dca1301f1591080241d1c5990671f799baf7d37cdb33fd8ef3247af83f7e"} Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.973369 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"318582c96fe131b871dd8cabd56da48e5ca0d3ec1266c03e01fbaf2fbd8a2a28"} Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.973417 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.976260 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.976319 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:00 crc kubenswrapper[4628]: I1211 05:15:00.976357 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 
05:15:01.065368 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.070339 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.070386 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.070399 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.070433 4628 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.982324 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.982315 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"86eafba1edb23013c7f70c5182bc61fd6af5e475a6b40b143dbf567b504b8bd1"} Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.983733 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.983795 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.983813 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.986272 4628 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="58e590008a3962ef7be8a4fa4b2c699739e9cb77a35c48b163e2380e0db07ccc" exitCode=0 Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.986425 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"58e590008a3962ef7be8a4fa4b2c699739e9cb77a35c48b163e2380e0db07ccc"} Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.986452 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.986487 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.986522 4628 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.986559 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.986971 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.987979 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.988018 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.988035 4628 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.988126 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.988171 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.988240 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.988881 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.988933 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.988951 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.989232 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.989273 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:01 crc kubenswrapper[4628]: I1211 05:15:01.989291 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:02 crc kubenswrapper[4628]: I1211 05:15:02.989896 4628 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 05:15:02 crc kubenswrapper[4628]: I1211 05:15:02.989981 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:02 crc kubenswrapper[4628]: I1211 05:15:02.991358 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:02 crc kubenswrapper[4628]: I1211 05:15:02.991426 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:02 crc kubenswrapper[4628]: I1211 05:15:02.991445 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:03 crc kubenswrapper[4628]: I1211 05:15:03.848384 4628 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 11 05:15:03 crc kubenswrapper[4628]: I1211 05:15:03.997932 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f70d394ef47a39c4925a101ca9a5bedf1f74846f06010142c48e3a3c856d903b"} Dec 11 05:15:04 crc kubenswrapper[4628]: I1211 05:15:04.413908 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 11 05:15:04 crc kubenswrapper[4628]: I1211 05:15:04.414110 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:04 crc kubenswrapper[4628]: I1211 05:15:04.415157 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:04 crc kubenswrapper[4628]: I1211 05:15:04.415203 4628 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:04 crc kubenswrapper[4628]: I1211 05:15:04.415228 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:04 crc kubenswrapper[4628]: I1211 05:15:04.712863 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:15:04 crc kubenswrapper[4628]: I1211 05:15:04.713012 4628 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 05:15:04 crc kubenswrapper[4628]: I1211 05:15:04.713055 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:04 crc kubenswrapper[4628]: I1211 05:15:04.714327 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:04 crc kubenswrapper[4628]: I1211 05:15:04.714368 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:04 crc kubenswrapper[4628]: I1211 05:15:04.714382 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:05 crc kubenswrapper[4628]: I1211 05:15:05.005896 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"563616f333f210646ffd0cc633f0634a7931773dbcf7fc6f0e156c13ab74b947"} Dec 11 05:15:05 crc kubenswrapper[4628]: I1211 05:15:05.005945 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"29269149297db0baedc9a33de7f0bdf77e0d3a1e5ad5bac90eb05874a8ea3741"} Dec 11 05:15:05 crc kubenswrapper[4628]: I1211 05:15:05.005958 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e238d6e9e7cbc3b07afb2dd6d67be1693aeb98ff570f6834c3dc6fe57622f889"} Dec 11 05:15:05 crc kubenswrapper[4628]: I1211 05:15:05.097583 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:15:05 crc kubenswrapper[4628]: I1211 05:15:05.097788 4628 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 05:15:05 crc kubenswrapper[4628]: I1211 05:15:05.097874 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:05 crc kubenswrapper[4628]: I1211 05:15:05.099597 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:05 crc kubenswrapper[4628]: I1211 05:15:05.099694 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:05 crc kubenswrapper[4628]: I1211 05:15:05.099715 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:05 crc kubenswrapper[4628]: I1211 05:15:05.534889 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:15:05 crc kubenswrapper[4628]: I1211 05:15:05.535086 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:05 crc kubenswrapper[4628]: I1211 
05:15:05.536368 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:05 crc kubenswrapper[4628]: I1211 05:15:05.536397 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:05 crc kubenswrapper[4628]: I1211 05:15:05.536409 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:05 crc kubenswrapper[4628]: I1211 05:15:05.543746 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:15:05 crc kubenswrapper[4628]: I1211 05:15:05.614359 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:15:05 crc kubenswrapper[4628]: I1211 05:15:05.703019 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:15:06 crc kubenswrapper[4628]: I1211 05:15:06.018776 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c145622ba42e9a6daf074357291e37a20dfcc6e4a5a004333f984265fa63c4ca"} Dec 11 05:15:06 crc kubenswrapper[4628]: I1211 05:15:06.018961 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:06 crc kubenswrapper[4628]: I1211 05:15:06.018984 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:06 crc kubenswrapper[4628]: I1211 05:15:06.019013 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:06 crc kubenswrapper[4628]: I1211 05:15:06.019104 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:15:06 crc kubenswrapper[4628]: I1211 05:15:06.020582 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:06 crc kubenswrapper[4628]: I1211 05:15:06.020638 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:06 crc kubenswrapper[4628]: I1211 05:15:06.020664 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:06 crc kubenswrapper[4628]: I1211 05:15:06.020700 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:06 crc kubenswrapper[4628]: I1211 05:15:06.020746 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:06 crc kubenswrapper[4628]: I1211 05:15:06.020767 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:06 crc kubenswrapper[4628]: I1211 05:15:06.020917 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:06 crc kubenswrapper[4628]: I1211 05:15:06.020957 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:06 crc kubenswrapper[4628]: I1211 05:15:06.020974 4628 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:07 crc kubenswrapper[4628]: I1211 05:15:07.024348 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:07 crc kubenswrapper[4628]: I1211 05:15:07.024419 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:07 crc kubenswrapper[4628]: I1211 05:15:07.025935 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:07 crc kubenswrapper[4628]: I1211 05:15:07.026000 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:07 crc kubenswrapper[4628]: I1211 05:15:07.026014 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:07 crc kubenswrapper[4628]: I1211 05:15:07.026024 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:07 crc kubenswrapper[4628]: I1211 05:15:07.026049 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:07 crc kubenswrapper[4628]: I1211 05:15:07.026193 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:07 crc kubenswrapper[4628]: E1211 05:15:07.963077 4628 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 11 05:15:08 crc kubenswrapper[4628]: I1211 05:15:08.034980 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 11 05:15:08 crc kubenswrapper[4628]: I1211 05:15:08.035217 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:08 crc kubenswrapper[4628]: I1211 05:15:08.036431 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:08 crc kubenswrapper[4628]: I1211 05:15:08.036498 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:08 crc kubenswrapper[4628]: I1211 05:15:08.036520 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:08 crc kubenswrapper[4628]: I1211 05:15:08.401985 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 11 05:15:09 crc kubenswrapper[4628]: I1211 05:15:09.029475 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:09 crc kubenswrapper[4628]: I1211 05:15:09.030725 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:09 crc kubenswrapper[4628]: I1211 05:15:09.030923 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:09 crc kubenswrapper[4628]: I1211 05:15:09.031107 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:09 crc kubenswrapper[4628]: I1211 05:15:09.584961 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:15:09 crc kubenswrapper[4628]: I1211 
05:15:09.585192 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:09 crc kubenswrapper[4628]: I1211 05:15:09.587292 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:09 crc kubenswrapper[4628]: I1211 05:15:09.587369 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:09 crc kubenswrapper[4628]: I1211 05:15:09.587395 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:09 crc kubenswrapper[4628]: I1211 05:15:09.592428 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:15:10 crc kubenswrapper[4628]: I1211 05:15:10.033309 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:10 crc kubenswrapper[4628]: I1211 05:15:10.034665 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:10 crc kubenswrapper[4628]: I1211 05:15:10.034751 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:10 crc kubenswrapper[4628]: I1211 05:15:10.034774 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:10 crc kubenswrapper[4628]: I1211 05:15:10.814305 4628 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 11 05:15:10 crc kubenswrapper[4628]: E1211 05:15:10.821044 4628 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 11 05:15:10 crc kubenswrapper[4628]: W1211 05:15:10.846691 4628 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 11 05:15:10 crc kubenswrapper[4628]: I1211 05:15:10.846818 4628 trace.go:236] Trace[654129759]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Dec-2025 05:15:00.845) (total time: 10000ms): Dec 11 05:15:10 crc kubenswrapper[4628]: Trace[654129759]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (05:15:10.846) Dec 11 05:15:10 crc kubenswrapper[4628]: Trace[654129759]: [10.000979704s] [10.000979704s] END Dec 11 05:15:10 crc kubenswrapper[4628]: E1211 05:15:10.846889 4628 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 11 05:15:10 crc kubenswrapper[4628]: W1211 05:15:10.952336 4628 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 11 05:15:10 crc kubenswrapper[4628]: I1211 05:15:10.952488 4628 trace.go:236] Trace[1488143576]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Dec-2025 05:15:00.950) (total time: 10001ms): Dec 11 05:15:10 crc kubenswrapper[4628]: Trace[1488143576]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (05:15:10.952) Dec 11 05:15:10 crc kubenswrapper[4628]: Trace[1488143576]: [10.001903632s] [10.001903632s] END Dec 11 05:15:10 crc kubenswrapper[4628]: E1211 05:15:10.952520 4628 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 11 05:15:11 crc kubenswrapper[4628]: E1211 05:15:11.071790 4628 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 11 05:15:11 crc kubenswrapper[4628]: W1211 05:15:11.815215 4628 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 11 05:15:11 crc kubenswrapper[4628]: I1211 05:15:11.815305 4628 trace.go:236] Trace[467716394]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Dec-2025 05:15:01.813) (total time: 10001ms): Dec 11 05:15:11 crc kubenswrapper[4628]: Trace[467716394]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (05:15:11.815) Dec 11 05:15:11 crc kubenswrapper[4628]: Trace[467716394]: [10.001316826s] [10.001316826s] END Dec 11 05:15:11 crc kubenswrapper[4628]: E1211 05:15:11.815328 4628 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 11 05:15:11 crc kubenswrapper[4628]: W1211 05:15:11.997829 4628 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 11 05:15:11 crc kubenswrapper[4628]: I1211 05:15:11.998032 4628 trace.go:236] Trace[958454944]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Dec-2025 05:15:01.996) (total time: 10001ms): Dec 11 05:15:11 crc kubenswrapper[4628]: Trace[958454944]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (05:15:11.997) Dec 11 05:15:11 crc kubenswrapper[4628]: Trace[958454944]: [10.001898971s] [10.001898971s] END Dec 11 05:15:11 crc kubenswrapper[4628]: E1211 05:15:11.998070 
4628 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 11 05:15:12 crc kubenswrapper[4628]: I1211 05:15:12.584977 4628 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 05:15:12 crc kubenswrapper[4628]: I1211 05:15:12.585093 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 05:15:12 crc kubenswrapper[4628]: E1211 05:15:12.821599 4628 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.18801151ffdd8ceb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-11 05:14:57.811967211 +0000 UTC m=+0.229313909,LastTimestamp:2025-12-11 05:14:57.811967211 +0000 UTC m=+0.229313909,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 11 05:15:13 crc kubenswrapper[4628]: E1211 05:15:13.850278 4628 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 11 05:15:14 crc kubenswrapper[4628]: I1211 05:15:14.273008 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:14 crc kubenswrapper[4628]: I1211 05:15:14.275082 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:14 crc kubenswrapper[4628]: I1211 05:15:14.275163 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:14 crc kubenswrapper[4628]: I1211 05:15:14.275191 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:14 crc kubenswrapper[4628]: I1211 05:15:14.275255 4628 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 11 05:15:15 crc kubenswrapper[4628]: I1211 05:15:15.098680 4628 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context 
deadline exceeded" start-of-body= Dec 11 05:15:15 crc kubenswrapper[4628]: I1211 05:15:15.098812 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded" Dec 11 05:15:17 crc kubenswrapper[4628]: E1211 05:15:17.963507 4628 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 11 05:15:18 crc kubenswrapper[4628]: I1211 05:15:18.068557 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 11 05:15:18 crc kubenswrapper[4628]: I1211 05:15:18.068823 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:18 crc kubenswrapper[4628]: I1211 05:15:18.070158 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:18 crc kubenswrapper[4628]: I1211 05:15:18.070227 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:18 crc kubenswrapper[4628]: I1211 05:15:18.070241 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:18 crc kubenswrapper[4628]: I1211 05:15:18.081800 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 11 05:15:18 crc kubenswrapper[4628]: I1211 05:15:18.767969 4628 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 11 05:15:18 crc kubenswrapper[4628]: I1211 05:15:18.768106 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 11 05:15:19 crc kubenswrapper[4628]: I1211 05:15:19.059564 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:19 crc kubenswrapper[4628]: I1211 05:15:19.060793 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:19 crc kubenswrapper[4628]: I1211 05:15:19.060873 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:19 crc kubenswrapper[4628]: I1211 05:15:19.060885 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:20 crc kubenswrapper[4628]: I1211 05:15:20.103527 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:15:20 crc kubenswrapper[4628]: I1211 05:15:20.103790 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:20 crc kubenswrapper[4628]: I1211 05:15:20.105461 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 11 05:15:20 crc kubenswrapper[4628]: I1211 05:15:20.105519 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:20 crc kubenswrapper[4628]: I1211 05:15:20.105536 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:20 crc kubenswrapper[4628]: I1211 05:15:20.108164 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:15:21 crc kubenswrapper[4628]: I1211 05:15:21.064518 4628 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 05:15:21 crc kubenswrapper[4628]: I1211 05:15:21.064579 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:21 crc kubenswrapper[4628]: I1211 05:15:21.065435 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:21 crc kubenswrapper[4628]: I1211 05:15:21.065478 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:21 crc kubenswrapper[4628]: I1211 05:15:21.065487 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:22 crc kubenswrapper[4628]: I1211 05:15:22.298110 4628 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": context deadline exceeded" start-of-body= Dec 11 05:15:22 crc kubenswrapper[4628]: I1211 05:15:22.298249 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": context deadline exceeded" Dec 11 05:15:22 crc kubenswrapper[4628]: I1211 05:15:22.361615 4628 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 11 05:15:22 crc kubenswrapper[4628]: I1211 05:15:22.384661 4628 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 11 05:15:22 crc kubenswrapper[4628]: I1211 05:15:22.585594 4628 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 05:15:22 crc kubenswrapper[4628]: I1211 05:15:22.585831 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 11 05:15:23 crc kubenswrapper[4628]: I1211 05:15:23.756709 4628 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 11 05:15:23 crc kubenswrapper[4628]: E1211 05:15:23.760319 4628 
kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 11 05:15:23 crc kubenswrapper[4628]: I1211 05:15:23.760476 4628 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 11 05:15:23 crc kubenswrapper[4628]: I1211 05:15:23.760481 4628 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 11 05:15:23 crc kubenswrapper[4628]: I1211 05:15:23.760985 4628 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 11 05:15:23 crc kubenswrapper[4628]: I1211 05:15:23.763385 4628 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 11 05:15:23 crc kubenswrapper[4628]: I1211 05:15:23.810134 4628 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:34852->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 11 05:15:23 crc kubenswrapper[4628]: I1211 05:15:23.810460 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:34852->192.168.126.11:17697: read: connection reset by peer" Dec 11 05:15:23 crc kubenswrapper[4628]: I1211 05:15:23.810137 4628 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:34866->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 11 05:15:23 crc kubenswrapper[4628]: I1211 05:15:23.812947 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:34866->192.168.126.11:17697: read: connection reset by peer" Dec 11 05:15:23 crc kubenswrapper[4628]: I1211 05:15:23.813278 4628 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 11 05:15:23 crc kubenswrapper[4628]: I1211 05:15:23.813364 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 11 05:15:23 crc kubenswrapper[4628]: I1211 05:15:23.823118 4628 apiserver.go:52] "Watching apiserver" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.119623 4628 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.120000 4628 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.120404 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.120970 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.120516 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.121279 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.121499 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.121787 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.121942 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.122054 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.122412 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.138341 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.139092 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.139419 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.139235 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.139268 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.139340 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.141469 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.141479 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.141695 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.173074 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.186757 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.199425 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.210129 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.217397 4628 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.222901 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.232427 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.255742 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.263687 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.263725 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.263745 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.263761 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.263796 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.263811 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.263826 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.263855 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.263869 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264089 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264122 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264089 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264251 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264277 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264290 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264317 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264340 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264492 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264355 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264536 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264712 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264748 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264765 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264838 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264857 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264937 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264778 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264977 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.264993 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265112 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265230 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265344 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265504 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265536 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265732 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265742 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265786 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265807 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265823 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265839 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265868 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265883 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265899 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265913 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265930 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265946 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265962 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265979 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.265995 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266010 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266025 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266042 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266059 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266246 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266299 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266393 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266522 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266077 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266601 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266630 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266649 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266663 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266679 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266693 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 11 
05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266708 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266728 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266743 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266757 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266772 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266786 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266806 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266820 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266834 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266864 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: 
\"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266880 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266894 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266908 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266923 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266939 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266954 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266969 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266985 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.266999 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267012 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267028 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267044 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267062 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267076 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267092 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267107 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267122 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267138 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267152 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267167 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267182 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267197 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267213 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267227 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267242 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267256 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267272 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267286 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267301 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267316 4628 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267909 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267930 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267946 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267960 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267977 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.267993 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268008 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268023 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268042 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 
05:15:24.268058 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268073 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268088 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268103 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268120 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268135 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268150 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268166 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268181 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268197 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 
11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268212 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268228 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268244 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268258 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268272 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268288 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268302 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268318 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268333 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268347 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268363 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268383 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268398 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268413 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268428 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268444 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268459 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268475 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268490 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268505 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " 
Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268521 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268538 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268554 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268569 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268584 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268600 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268615 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268630 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268644 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268660 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") 
" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268675 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268692 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268728 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268744 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268759 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268775 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268954 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268971 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.268988 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.269003 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.269018 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.269034 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.269050 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.269066 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.269228 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.269643 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.269801 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.269919 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.269943 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.269961 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.269982 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270001 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270025 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270044 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270063 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270082 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270102 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270124 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270144 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270161 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270176 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270193 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270208 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270226 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270243 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270258 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270275 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270290 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270305 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270323 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270338 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270355 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270371 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270388 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270404 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270419 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270435 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270451 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270466 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270482 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270545 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270563 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270579 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270595 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270612 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270628 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270643 4628 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270659 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270674 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270690 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270707 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270722 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270738 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270754 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270771 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270792 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270808 4628 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270857 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270878 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270896 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270913 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270932 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270971 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.270988 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.271009 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.271028 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.271044 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.271066 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.271082 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.271099 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.271117 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.272815 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.272834 4628 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.272860 4628 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.272871 4628 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.272883 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.272893 4628 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.272903 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.272913 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.272922 4628 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.272932 4628 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.272942 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.272951 4628 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.272961 4628 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.272972 4628 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.272982 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.272992 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.273001 4628 reconciler_common.go:293] "Volume 
detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.273010 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.273020 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.273029 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.273039 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.273048 4628 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.273058 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.273069 4628 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.273078 4628 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.273114 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.273136 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.273307 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.273470 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.273492 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.273655 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.273666 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.273824 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.274009 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.274141 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.274829 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.275211 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.275347 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.275529 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.275854 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.276065 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.276201 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.277054 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.277057 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.277262 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.277326 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.277641 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.277929 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.278156 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.278221 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.278440 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.278480 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.278519 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.278626 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.278816 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.279080 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.279407 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.279105 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.279642 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.280220 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.280548 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.281094 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.281181 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.281634 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.281630 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). 
InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.281816 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.282003 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.282008 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.282266 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.281498 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.282475 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.282493 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.283209 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.283439 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.283681 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.283810 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.284020 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.284103 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.284408 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.284644 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.284942 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.284960 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.284986 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.285033 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.285082 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.285128 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.285791 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.285859 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.286047 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.286235 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.288114 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.288355 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.288392 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.288654 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.288814 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.289768 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.289830 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.289993 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.290146 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.290622 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.290731 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.290978 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.291221 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.291587 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.292367 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.292773 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.293337 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.293584 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.293655 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.293914 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.294134 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.294463 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.294867 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.295249 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.295547 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.295880 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.296174 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.296220 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.296349 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.297140 4628 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.297218 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.297262 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.297279 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.297448 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.297526 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.298011 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.298045 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.298481 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.298586 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.298602 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.298772 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.298956 4628 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.300225 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:24.800207506 +0000 UTC m=+27.217554204 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.300441 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.299576 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.299131 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.299240 4628 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.300553 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.300580 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:24.800571156 +0000 UTC m=+27.217917854 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.300684 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.300805 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.300715 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.301040 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.299377 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.299526 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.299695 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.300103 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.301145 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.299266 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.301416 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.301486 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.301499 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.301501 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.301568 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.301592 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.301607 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.301670 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.302028 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.302281 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:24.8022707 +0000 UTC m=+27.219617398 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.302559 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.302614 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.302998 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.303417 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.303797 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.304358 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.304408 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.304421 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.304451 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.304565 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.304681 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.304810 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.304881 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.304904 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.305036 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.305060 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.305231 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.305251 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.304201 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.305357 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.305641 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.305713 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.305761 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.305991 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.306106 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.306404 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.306640 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.306723 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.307155 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.307169 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.307000 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.307311 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.308756 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.309013 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.309036 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.309395 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.309759 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.310030 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.312086 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.314568 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.315004 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.315025 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.315038 4628 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.315099 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:24.815073282 +0000 UTC m=+27.232419970 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.316641 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.316760 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.317104 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.317410 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.319457 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.322667 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.323244 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.323345 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.323419 4628 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.323506 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:24.823492077 +0000 UTC m=+27.240838775 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.335409 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.336429 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.344500 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.351043 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373341 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373377 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373443 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373454 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373463 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373471 4628 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373479 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373488 4628 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373497 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373506 4628 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373514 
4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373523 4628 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373533 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373541 4628 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373550 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373591 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373600 4628 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373609 4628 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373618 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373627 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373636 4628 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373645 4628 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373655 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 
05:15:24.373665 4628 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373674 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373683 4628 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373693 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373701 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373710 4628 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373720 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373729 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373740 4628 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373749 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373758 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373768 4628 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373776 4628 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc 
kubenswrapper[4628]: I1211 05:15:24.373784 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373793 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373802 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373802 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373812 4628 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373832 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373856 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373866 4628 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373876 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373885 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373893 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373902 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373910 4628 reconciler_common.go:293] "Volume detached for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373919 4628 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373927 4628 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373935 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373943 4628 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373950 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373958 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373966 4628 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373974 4628 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373981 4628 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373990 4628 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.373998 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374005 4628 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374013 4628 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374021 4628 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374028 4628 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374036 4628 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374044 4628 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374052 4628 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374060 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374067 4628 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374075 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374083 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374091 4628 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374098 4628 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374106 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374114 4628 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374122 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374129 4628 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374137 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374144 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374153 4628 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374161 4628 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374169 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374177 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374185 4628 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374193 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374201 4628 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374209 4628 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374217 4628 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374225 4628 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374232 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374244 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374256 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374264 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374272 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374279 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374287 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374296 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374304 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374312 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374319 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374328 4628 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on 
node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374336 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374343 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374351 4628 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374359 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374367 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374375 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374384 4628 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374392 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374401 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374410 4628 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374419 4628 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374427 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374435 4628 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374444 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374452 4628 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374460 4628 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374467 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374474 4628 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374482 4628 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374489 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374497 4628 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374507 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374515 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374524 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374532 4628 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374541 4628 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" 
(UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374551 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374560 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374569 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374576 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374585 4628 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374593 4628 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374600 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374607 4628 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374615 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374623 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374631 4628 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374638 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374646 4628 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374655 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374662 4628 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374670 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374677 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374685 4628 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374693 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374701 4628 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374708 4628 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374717 4628 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374724 4628 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374732 4628 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374740 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374747 4628 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374755 4628 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374763 4628 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374771 4628 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374786 4628 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374794 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374801 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374810 4628 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374817 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374825 4628 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374833 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374854 4628 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374862 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374869 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" 
DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374877 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374885 4628 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374893 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374902 4628 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374910 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374918 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.374926 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.451879 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.459438 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.467074 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 11 05:15:24 crc kubenswrapper[4628]: W1211 05:15:24.467491 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-bd6bc8a48d24e46f07d8eb9bc21b9cbb3c932677e303444391f2f6c61478ab7d WatchSource:0}: Error finding container bd6bc8a48d24e46f07d8eb9bc21b9cbb3c932677e303444391f2f6c61478ab7d: Status 404 returned error can't find the container with id bd6bc8a48d24e46f07d8eb9bc21b9cbb3c932677e303444391f2f6c61478ab7d Dec 11 05:15:24 crc kubenswrapper[4628]: W1211 05:15:24.471307 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-7e3335fa28759c64235a640c9019f0b8d03e5b780cca2dfb8bbe44265775495a WatchSource:0}: Error finding container 7e3335fa28759c64235a640c9019f0b8d03e5b780cca2dfb8bbe44265775495a: Status 404 returned error can't find the container with id 7e3335fa28759c64235a640c9019f0b8d03e5b780cca2dfb8bbe44265775495a Dec 11 05:15:24 crc kubenswrapper[4628]: W1211 05:15:24.480417 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-f38a244426a2062ff0a5c18fd709633256b031bddc08c378c0e5ec45a6842a5d WatchSource:0}: Error finding container f38a244426a2062ff0a5c18fd709633256b031bddc08c378c0e5ec45a6842a5d: Status 404 returned error can't find the container with id f38a244426a2062ff0a5c18fd709633256b031bddc08c378c0e5ec45a6842a5d Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.881052 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.881253 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:25.8812222 +0000 UTC m=+28.298568898 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.881404 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.881443 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.881471 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:24 crc kubenswrapper[4628]: I1211 05:15:24.881495 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.881549 4628 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.881608 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:25.88159949 +0000 UTC m=+28.298946188 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.881614 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.881632 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.881646 4628 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.881704 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:25.881689563 +0000 UTC m=+28.299036271 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.881730 4628 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.881813 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:25.881793175 +0000 UTC m=+28.299139913 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.882116 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.882132 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.882143 4628 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:24 crc kubenswrapper[4628]: E1211 05:15:24.882191 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:25.882181436 +0000 UTC m=+28.299528144 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.077359 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1cf19bed0bbed034b96ccf8f564222d68e9f420cd549fef82ca4c2c28691e1c4"} Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.077450 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7e3335fa28759c64235a640c9019f0b8d03e5b780cca2dfb8bbe44265775495a"} Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.078887 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"01a959b29471ea250c0b29d8d4e9934024b725b1530c5570e62561d572b5badc"} Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.079031 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bd6bc8a48d24e46f07d8eb9bc21b9cbb3c932677e303444391f2f6c61478ab7d"} Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.080810 4628 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.083004 4628 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="86eafba1edb23013c7f70c5182bc61fd6af5e475a6b40b143dbf567b504b8bd1" exitCode=255 Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.083084 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"86eafba1edb23013c7f70c5182bc61fd6af5e475a6b40b143dbf567b504b8bd1"} Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.083980 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f38a244426a2062ff0a5c18fd709633256b031bddc08c378c0e5ec45a6842a5d"} Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.094365 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.104061 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.104525 4628 scope.go:117] "RemoveContainer" containerID="86eafba1edb23013c7f70c5182bc61fd6af5e475a6b40b143dbf567b504b8bd1" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.109748 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.121521 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a959b29471ea250c0b29d8d4e9934024b725b1530c5570e62561d572b5badc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.131908 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.142436 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.158089 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.171015 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.180769 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.193772 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a959b29471ea250c0b29d8d4e9934024b725b1530c5570e62561d572b5badc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.208093 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.224666 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.236978 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.250163 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a98947e-b435-4c9f-8356-537c79cc8b22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62618b6f7436c23be40f65807a4b596cc5239cbc0a3bcb56392a432931cee1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32fc5b6abbb405e45900e4faa4990cca046cd21b5f284b0e6903388ec44fbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6be3ef18bdf9e850ba13649f4bd7aa9fe150f3791ed3e7d8ccd5d8439fbd76a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86eafba1edb23013c7f70c5182bc61fd6af5e475a6b40b143dbf567b504b8bd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86eafba1edb23013c7f70c5182bc61fd6af5e475a6b40b143dbf567b504b8bd1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11
T05:15:24Z\\\",\\\"message\\\":\\\":15:23.782686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 05:15:23.782808 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1211 05:15:23.787047 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1211 05:15:23.787073 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1211 05:15:23.787166 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1211 05:15:23.787169 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1211 05:15:23.787338 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1211 05:15:23.787353 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1211 05:15:23.787649 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3819499919/tls.crt::/tmp/serving-cert-3819499919/tls.key\\\\\\\"\\\\nI1211 05:15:23.787652 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3819499919/tls.crt::/tmp/serving-cert-3819499919/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765430101\\\\\\\\\\\\\\\" (2025-12-11 05:15:00 +0000 UTC to 2026-01-10 05:15:01 +0000 UTC (now=2025-12-11 05:15:23.787597706 +0000 UTC))\\\\\\\"\\\\nI1211 05:15:23.787688 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1211 05:15:23.787731 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nF1211 05:15:23.787788 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d87588e039f5452312557720f5985a726a905bb51912c2c7b35ecee3858453\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T05:14:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T05:14:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T05:14:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.889470 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.889520 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.889569 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.889567 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:25 crc kubenswrapper[4628]: E1211 05:15:25.889611 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:27.889580562 +0000 UTC m=+30.306927260 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:25 crc kubenswrapper[4628]: E1211 05:15:25.889632 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.889648 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.889727 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:25 crc kubenswrapper[4628]: E1211 05:15:25.889740 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.889758 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.889789 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:25 crc kubenswrapper[4628]: E1211 05:15:25.889826 4628 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 05:15:25 crc kubenswrapper[4628]: E1211 05:15:25.889880 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:27.88986593 +0000 UTC m=+30.307212628 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 05:15:25 crc kubenswrapper[4628]: E1211 05:15:25.889899 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 05:15:25 crc kubenswrapper[4628]: E1211 05:15:25.889935 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 05:15:25 crc kubenswrapper[4628]: E1211 05:15:25.890006 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 05:15:25 crc kubenswrapper[4628]: E1211 05:15:25.890051 4628 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:25 crc kubenswrapper[4628]: E1211 05:15:25.889952 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 05:15:25 crc kubenswrapper[4628]: E1211 05:15:25.890098 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 05:15:25 crc kubenswrapper[4628]: E1211 05:15:25.890114 4628 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:25 crc kubenswrapper[4628]: E1211 05:15:25.890118 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:27.890092375 +0000 UTC m=+30.307439073 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:25 crc kubenswrapper[4628]: E1211 05:15:25.890169 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:27.890153847 +0000 UTC m=+30.307500545 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:25 crc kubenswrapper[4628]: E1211 05:15:25.890221 4628 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 05:15:25 crc kubenswrapper[4628]: E1211 05:15:25.890253 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:27.89024735 +0000 UTC m=+30.307594048 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.893567 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.894080 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.894874 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.895455 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.896029 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.896503 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.897072 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.897579 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.898197 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.898697 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.899189 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.899894 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.900418 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.900945 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.901460 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.904640 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.905243 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.905625 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.907272 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.908000 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.908515 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.910299 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.910813 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" 
path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.911989 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.912384 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.914001 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.914753 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.915689 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.916249 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.916683 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.917526 4628 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.917633 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.919397 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.920305 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.920696 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.922166 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.923237 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.923734 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.924730 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.925456 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.926066 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.927178 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.928202 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.928807 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.929620 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.930152 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.931021 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.931742 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.932637 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.933106 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.933564 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" 
path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.934425 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.934970 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 11 05:15:25 crc kubenswrapper[4628]: I1211 05:15:25.935790 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 11 05:15:26 crc kubenswrapper[4628]: I1211 05:15:26.087944 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 11 05:15:26 crc kubenswrapper[4628]: I1211 05:15:26.089272 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3d741a2cbe15031dd2689b0f56a89a4671027c8d4520f89d26955ed5f83ac913"} Dec 11 05:15:26 crc kubenswrapper[4628]: I1211 05:15:26.089565 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:15:26 crc kubenswrapper[4628]: I1211 05:15:26.090979 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0bcfd78b1c0fb550bb13b82363869b70b6dbc7b8bd9f76b07724a86c81d8375e"} Dec 11 05:15:26 crc kubenswrapper[4628]: I1211 05:15:26.106387 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:26Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:26 crc kubenswrapper[4628]: I1211 05:15:26.120935 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:26Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:26 crc kubenswrapper[4628]: I1211 05:15:26.133353 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:26Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:26 crc kubenswrapper[4628]: I1211 05:15:26.152721 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a959b29471ea250c0b29d8d4e9934024b725b1530c5570e62561d572b5badc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:26Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:26 crc kubenswrapper[4628]: I1211 05:15:26.169019 4628 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:26Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:26 crc kubenswrapper[4628]: I1211 05:15:26.180089 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:26Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:26 crc kubenswrapper[4628]: I1211 05:15:26.199327 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a98947e-b435-4c9f-8356-537c79cc8b22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62618b6f7436c23be40f65807a4b596cc5239cbc0a3bcb56392a432931cee1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32fc5b6abbb405e45900e4faa4990cca046cd21b5f284b0e6903388ec44fbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6be3ef18bdf9e850ba13649f4bd7aa9fe150f3791ed3e7d8ccd5d8439fbd76a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d741a2cbe15031dd2689b0f56a89a4671027c8d4520f89d26955ed5f83ac913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86eafba1edb23013c7f70c5182bc61fd6af5e475a6b40b143dbf567b504b8bd1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\":15:23.782686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1211 05:15:23.782808 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1211 05:15:23.787047 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1211 05:15:23.787073 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1211 05:15:23.787166 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1211 05:15:23.787169 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1211 05:15:23.787338 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1211 05:15:23.787353 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1211 05:15:23.787649 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3819499919/tls.crt::/tmp/serving-cert-3819499919/tls.key\\\\\\\"\\\\nI1211 05:15:23.787652 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3819499919/tls.crt::/tmp/serving-cert-3819499919/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765430101\\\\\\\\\\\\\\\" (2025-12-11 05:15:00 +0000 UTC to 2026-01-10 05:15:01 +0000 UTC (now=2025-12-11 05:15:23.787597706 +0000 UTC))\\\\\\\"\\\\nI1211 05:15:23.787688 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1211 05:15:23.787731 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nF1211 05:15:23.787788 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d87588e039f5452312557720f5985a726a905bb51912c2c7b35ecee3858453\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T05:14:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T05:14:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T05:14:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:26Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:26 crc kubenswrapper[4628]: I1211 05:15:26.220785 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:26Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:26 crc kubenswrapper[4628]: I1211 05:15:26.234140 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:26Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:26 crc kubenswrapper[4628]: I1211 05:15:26.247198 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a959b29471ea250c0b29d8d4e9934024b725b1530c5570e62561d572b5badc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:26Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:26 crc kubenswrapper[4628]: I1211 05:15:26.260425 4628 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:26Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:26 crc kubenswrapper[4628]: I1211 05:15:26.273300 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bcfd78b1c0fb550bb13b82363869b70b6dbc7b8bd9f76b07724a86c81d8375e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf19bed0bbed034b96ccf8f564222d68e9f420cd549fef82ca4c2c28691e1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:26Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:26 crc kubenswrapper[4628]: I1211 05:15:26.284261 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:26Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:26 crc kubenswrapper[4628]: I1211 05:15:26.295940 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a98947e-b435-4c9f-8356-537c79cc8b22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62618b6f7436c23be40f65807a4b596cc5239cbc0a3bcb56392a432931cee1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32fc5b6abbb405e45900e4faa4990cca046cd21b5f284b0e6903388ec44fbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6be3ef18bdf9e850ba13649f4bd7aa9fe150f3791ed3e7d8ccd5d8439fbd76a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d741a2cbe15031dd2689b0f56a89a4671027c8d4520f89d26955ed5f83ac913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86eafba1edb23013c7f70c5182bc61fd6af5e475a6b40b143dbf567b504b8bd1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\":15:23.782686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1211 05:15:23.782808 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1211 05:15:23.787047 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1211 05:15:23.787073 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1211 05:15:23.787166 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1211 05:15:23.787169 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1211 05:15:23.787338 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1211 05:15:23.787353 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1211 05:15:23.787649 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3819499919/tls.crt::/tmp/serving-cert-3819499919/tls.key\\\\\\\"\\\\nI1211 05:15:23.787652 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3819499919/tls.crt::/tmp/serving-cert-3819499919/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765430101\\\\\\\\\\\\\\\" (2025-12-11 05:15:00 +0000 UTC to 2026-01-10 05:15:01 +0000 UTC (now=2025-12-11 05:15:23.787597706 +0000 UTC))\\\\\\\"\\\\nI1211 05:15:23.787688 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1211 05:15:23.787731 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nF1211 05:15:23.787788 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d87588e039f5452312557720f5985a726a905bb51912c2c7b35ecee3858453\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T05:14:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T05:14:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T05:14:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:26Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:27 crc kubenswrapper[4628]: I1211 05:15:27.094324 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"eca49e754ff21b379e148724c96a5b98dda614d42220b1bce86fd1880d82491c"} Dec 11 05:15:27 crc kubenswrapper[4628]: I1211 05:15:27.113053 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3a98947e-b435-4c9f-8356-537c79cc8b22\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-11T05:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62618b6f7436c23be40f65807a4b596cc5239cbc0a3bcb56392a432931cee1e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32fc5b6abbb405e45900e4faa4990cca046cd21b5f284b0e6903388ec44fbd6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6be3ef18bdf9e850ba13649f4bd7aa9fe150f3791ed3e7d8ccd5d8439fbd76a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d741a2cbe15031dd2689b0f56a89a4671027c8d4520f89d26955ed5f83ac913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86eafba1edb23013c7f70c5182bc61fd6af5e475a6b40b143dbf567b504b8bd1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\":15:23.782686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1211 05:15:23.782808 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1211 05:15:23.787047 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1211 05:15:23.787073 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1211 05:15:23.787166 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1211 05:15:23.787169 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1211 05:15:23.787338 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1211 05:15:23.787353 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1211 05:15:23.787649 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3819499919/tls.crt::/tmp/serving-cert-3819499919/tls.key\\\\\\\"\\\\nI1211 05:15:23.787652 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-3819499919/tls.crt::/tmp/serving-cert-3819499919/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1765430101\\\\\\\\\\\\\\\" (2025-12-11 05:15:00 +0000 UTC to 2026-01-10 05:15:01 +0000 UTC (now=2025-12-11 05:15:23.787597706 +0000 UTC))\\\\\\\"\\\\nI1211 05:15:23.787688 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1211 05:15:23.787731 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nF1211 05:15:23.787788 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32d87588e039f5452312557720f5985a726a905bb51912c2c7b35ecee3858453\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-11T05:14:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-11T05:14:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-11T05:14:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:27Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:27 crc kubenswrapper[4628]: I1211 05:15:27.128108 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:27Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:27 crc kubenswrapper[4628]: I1211 05:15:27.144101 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a959b29471ea250c0b29d8d4e9934024b725b1530c5570e62561d572b5badc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:27Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:27 crc kubenswrapper[4628]: I1211 05:15:27.157495 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:27Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:27 crc kubenswrapper[4628]: I1211 05:15:27.208607 4628 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-11T05:15:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bcfd78b1c0fb550bb13b82363869b70b6dbc7b8bd9f76b07724a86c81d8375e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cf19bed0bbed034b96ccf8f564222d68e9f420cd549fef82ca4c2c28691e1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-11T05:15:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-11T05:15:27Z is after 2025-08-24T17:21:41Z" Dec 11 05:15:27 crc kubenswrapper[4628]: I1211 05:15:27.888619 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:27 crc kubenswrapper[4628]: I1211 05:15:27.888683 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:27 crc kubenswrapper[4628]: E1211 05:15:27.888723 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 05:15:27 crc kubenswrapper[4628]: I1211 05:15:27.888618 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:27 crc kubenswrapper[4628]: E1211 05:15:27.888871 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 05:15:27 crc kubenswrapper[4628]: E1211 05:15:27.888905 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 05:15:27 crc kubenswrapper[4628]: I1211 05:15:27.906436 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:27 crc kubenswrapper[4628]: I1211 05:15:27.906542 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:27 crc kubenswrapper[4628]: E1211 05:15:27.906587 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:31.906566752 +0000 UTC m=+34.323913450 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:27 crc kubenswrapper[4628]: E1211 05:15:27.906669 4628 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 05:15:27 crc kubenswrapper[4628]: E1211 05:15:27.906707 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:31.906698245 +0000 UTC m=+34.324044943 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 05:15:27 crc kubenswrapper[4628]: E1211 05:15:27.906750 4628 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 05:15:27 crc kubenswrapper[4628]: E1211 05:15:27.906787 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:31.906775807 +0000 UTC m=+34.324122505 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 05:15:27 crc kubenswrapper[4628]: I1211 05:15:27.906570 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:27 crc kubenswrapper[4628]: I1211 05:15:27.906995 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:27 crc kubenswrapper[4628]: I1211 05:15:27.907018 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:27 crc kubenswrapper[4628]: E1211 05:15:27.907097 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 05:15:27 crc kubenswrapper[4628]: E1211 05:15:27.907110 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 05:15:27 crc kubenswrapper[4628]: E1211 05:15:27.907120 4628 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:27 crc kubenswrapper[4628]: E1211 05:15:27.907149 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:31.907138957 +0000 UTC m=+34.324485655 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:27 crc kubenswrapper[4628]: E1211 05:15:27.907194 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 05:15:27 crc kubenswrapper[4628]: E1211 05:15:27.907205 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 05:15:27 crc kubenswrapper[4628]: E1211 05:15:27.907213 4628 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:27 crc kubenswrapper[4628]: E1211 05:15:27.907233 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:31.907227439 +0000 UTC m=+34.324574137 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:27 crc kubenswrapper[4628]: I1211 05:15:27.984465 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=2.984441109 podStartE2EDuration="2.984441109s" podCreationTimestamp="2025-12-11 05:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:27.961329223 +0000 UTC m=+30.378675931" watchObservedRunningTime="2025-12-11 05:15:27.984441109 +0000 UTC m=+30.401787827" Dec 11 05:15:29 crc kubenswrapper[4628]: I1211 05:15:29.589543 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:15:29 crc kubenswrapper[4628]: I1211 05:15:29.597284 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:15:29 crc kubenswrapper[4628]: I1211 05:15:29.602106 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 11 05:15:29 crc kubenswrapper[4628]: I1211 05:15:29.889588 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:29 crc kubenswrapper[4628]: I1211 05:15:29.890165 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:29 crc kubenswrapper[4628]: I1211 05:15:29.890420 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:29 crc kubenswrapper[4628]: E1211 05:15:29.891372 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 05:15:29 crc kubenswrapper[4628]: E1211 05:15:29.890806 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 05:15:29 crc kubenswrapper[4628]: E1211 05:15:29.890632 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.160867 4628 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.162834 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.162966 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.162989 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.163127 4628 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.172127 4628 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.172675 4628 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.174694 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.174761 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.174774 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.174794 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 11 05:15:30 crc 
kubenswrapper[4628]: I1211 05:15:30.174807 4628 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-11T05:15:30Z","lastTransitionTime":"2025-12-11T05:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.660562 4628 csr.go:261] certificate signing request csr-bqdnr is approved, waiting to be issued Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.683319 4628 csr.go:257] certificate signing request csr-bqdnr is issued Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.907463 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jkp5h"] Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.907755 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jkp5h" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.913014 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.913119 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.913244 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.914778 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.936896 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-9btrm"] Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.937314 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-9btrm" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.940530 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.941101 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.941101 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 11 05:15:30 crc kubenswrapper[4628]: I1211 05:15:30.984530 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=1.984497616 podStartE2EDuration="1.984497616s" podCreationTimestamp="2025-12-11 05:15:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:30.950476739 +0000 UTC m=+33.367823447" watchObservedRunningTime="2025-12-11 05:15:30.984497616 +0000 UTC m=+33.401844314" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.019379 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f"] Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.019823 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.024676 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.024672 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.024680 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.025229 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.037396 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/43aec01e-0431-462f-b907-69ef727f84ab-serviceca\") pod \"node-ca-jkp5h\" (UID: \"43aec01e-0431-462f-b907-69ef727f84ab\") " pod="openshift-image-registry/node-ca-jkp5h" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.037440 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wqg6\" (UniqueName: \"kubernetes.io/projected/dc9a9216-ecbe-43cb-b339-8cfd9b0a25c5-kube-api-access-4wqg6\") pod \"node-resolver-9btrm\" (UID: \"dc9a9216-ecbe-43cb-b339-8cfd9b0a25c5\") " pod="openshift-dns/node-resolver-9btrm" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.037466 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvlx2\" (UniqueName: \"kubernetes.io/projected/43aec01e-0431-462f-b907-69ef727f84ab-kube-api-access-lvlx2\") pod \"node-ca-jkp5h\" (UID: \"43aec01e-0431-462f-b907-69ef727f84ab\") " pod="openshift-image-registry/node-ca-jkp5h" Dec 11 05:15:31 crc 
kubenswrapper[4628]: I1211 05:15:31.037519 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43aec01e-0431-462f-b907-69ef727f84ab-host\") pod \"node-ca-jkp5h\" (UID: \"43aec01e-0431-462f-b907-69ef727f84ab\") " pod="openshift-image-registry/node-ca-jkp5h" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.037534 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dc9a9216-ecbe-43cb-b339-8cfd9b0a25c5-hosts-file\") pod \"node-resolver-9btrm\" (UID: \"dc9a9216-ecbe-43cb-b339-8cfd9b0a25c5\") " pod="openshift-dns/node-resolver-9btrm" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.071278 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-m7bbt"] Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.071747 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.076051 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.077766 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.078165 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.078472 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.078596 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.099958 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-hvwvx"] Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.100681 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-jbvg4"] Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.101984 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.102380 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.108651 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.108827 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.109021 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.109266 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.109588 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.111135 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.111284 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.138314 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/38344908-500a-46d6-87dd-a4160d445b93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lnc9f\" (UID: \"38344908-500a-46d6-87dd-a4160d445b93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.138395 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38344908-500a-46d6-87dd-a4160d445b93-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lnc9f\" (UID: \"38344908-500a-46d6-87dd-a4160d445b93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.138429 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43aec01e-0431-462f-b907-69ef727f84ab-host\") pod \"node-ca-jkp5h\" (UID: \"43aec01e-0431-462f-b907-69ef727f84ab\") " pod="openshift-image-registry/node-ca-jkp5h" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.138450 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dc9a9216-ecbe-43cb-b339-8cfd9b0a25c5-hosts-file\") pod \"node-resolver-9btrm\" (UID: \"dc9a9216-ecbe-43cb-b339-8cfd9b0a25c5\") " pod="openshift-dns/node-resolver-9btrm" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.138473 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38344908-500a-46d6-87dd-a4160d445b93-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lnc9f\" (UID: \"38344908-500a-46d6-87dd-a4160d445b93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 
05:15:31.138491 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/38344908-500a-46d6-87dd-a4160d445b93-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lnc9f\" (UID: \"38344908-500a-46d6-87dd-a4160d445b93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.138509 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/38344908-500a-46d6-87dd-a4160d445b93-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lnc9f\" (UID: \"38344908-500a-46d6-87dd-a4160d445b93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.138531 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/43aec01e-0431-462f-b907-69ef727f84ab-serviceca\") pod \"node-ca-jkp5h\" (UID: \"43aec01e-0431-462f-b907-69ef727f84ab\") " pod="openshift-image-registry/node-ca-jkp5h" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.138555 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wqg6\" (UniqueName: \"kubernetes.io/projected/dc9a9216-ecbe-43cb-b339-8cfd9b0a25c5-kube-api-access-4wqg6\") pod \"node-resolver-9btrm\" (UID: \"dc9a9216-ecbe-43cb-b339-8cfd9b0a25c5\") " pod="openshift-dns/node-resolver-9btrm" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.138579 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvlx2\" (UniqueName: \"kubernetes.io/projected/43aec01e-0431-462f-b907-69ef727f84ab-kube-api-access-lvlx2\") pod \"node-ca-jkp5h\" (UID: \"43aec01e-0431-462f-b907-69ef727f84ab\") " pod="openshift-image-registry/node-ca-jkp5h" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.138856 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43aec01e-0431-462f-b907-69ef727f84ab-host\") pod \"node-ca-jkp5h\" (UID: \"43aec01e-0431-462f-b907-69ef727f84ab\") " pod="openshift-image-registry/node-ca-jkp5h" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.138927 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/dc9a9216-ecbe-43cb-b339-8cfd9b0a25c5-hosts-file\") pod \"node-resolver-9btrm\" (UID: \"dc9a9216-ecbe-43cb-b339-8cfd9b0a25c5\") " pod="openshift-dns/node-resolver-9btrm" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.139719 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/43aec01e-0431-462f-b907-69ef727f84ab-serviceca\") pod \"node-ca-jkp5h\" (UID: \"43aec01e-0431-462f-b907-69ef727f84ab\") " pod="openshift-image-registry/node-ca-jkp5h" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.153784 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-rfts6"] Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.154322 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.154394 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfts6" podUID="68d7972a-8fde-4878-a758-99ed42b3e4c5" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.160284 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wqg6\" (UniqueName: \"kubernetes.io/projected/dc9a9216-ecbe-43cb-b339-8cfd9b0a25c5-kube-api-access-4wqg6\") pod \"node-resolver-9btrm\" (UID: \"dc9a9216-ecbe-43cb-b339-8cfd9b0a25c5\") " pod="openshift-dns/node-resolver-9btrm" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.162876 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvlx2\" (UniqueName: \"kubernetes.io/projected/43aec01e-0431-462f-b907-69ef727f84ab-kube-api-access-lvlx2\") pod \"node-ca-jkp5h\" (UID: \"43aec01e-0431-462f-b907-69ef727f84ab\") " pod="openshift-image-registry/node-ca-jkp5h" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.204678 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-r7545"] Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.205751 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.211061 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.212547 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.212859 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.213180 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.213376 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.215855 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.216112 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.219294 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jkp5h" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239229 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-host-run-k8s-cni-cncf-io\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239268 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-host-run-netns\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239288 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-host-var-lib-cni-multus\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239305 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-host-run-multus-certs\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239330 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2cbe69b9-c210-427d-9807-bf7cf7a70e3a-rootfs\") pod \"machine-config-daemon-hvwvx\" (UID: \"2cbe69b9-c210-427d-9807-bf7cf7a70e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239347 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68d7972a-8fde-4878-a758-99ed42b3e4c5-metrics-certs\") pod \"network-metrics-daemon-rfts6\" (UID: \"68d7972a-8fde-4878-a758-99ed42b3e4c5\") " pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239373 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/38344908-500a-46d6-87dd-a4160d445b93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lnc9f\" (UID: \"38344908-500a-46d6-87dd-a4160d445b93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239398 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-os-release\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239438 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/12120ff3-d5ea-4737-924f-49f9c0c347b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239462 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-multus-socket-dir-parent\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239490 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38344908-500a-46d6-87dd-a4160d445b93-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lnc9f\" (UID: \"38344908-500a-46d6-87dd-a4160d445b93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239510 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8lhb\" (UniqueName: \"kubernetes.io/projected/68d7972a-8fde-4878-a758-99ed42b3e4c5-kube-api-access-s8lhb\") pod \"network-metrics-daemon-rfts6\" (UID: \"68d7972a-8fde-4878-a758-99ed42b3e4c5\") " pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239528 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-multus-conf-dir\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239553 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/12120ff3-d5ea-4737-924f-49f9c0c347b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239573 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2cbe69b9-c210-427d-9807-bf7cf7a70e3a-mcd-auth-proxy-config\") pod \"machine-config-daemon-hvwvx\" (UID: \"2cbe69b9-c210-427d-9807-bf7cf7a70e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239591 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-host-var-lib-kubelet\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239614 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zj7b\" (UniqueName: \"kubernetes.io/projected/12120ff3-d5ea-4737-924f-49f9c0c347b1-kube-api-access-6zj7b\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " 
pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239633 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjvf9\" (UniqueName: \"kubernetes.io/projected/2cbe69b9-c210-427d-9807-bf7cf7a70e3a-kube-api-access-kjvf9\") pod \"machine-config-daemon-hvwvx\" (UID: \"2cbe69b9-c210-427d-9807-bf7cf7a70e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239652 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-host-var-lib-cni-bin\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239669 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/12120ff3-d5ea-4737-924f-49f9c0c347b1-system-cni-dir\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239684 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cbe69b9-c210-427d-9807-bf7cf7a70e3a-proxy-tls\") pod \"machine-config-daemon-hvwvx\" (UID: \"2cbe69b9-c210-427d-9807-bf7cf7a70e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239702 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-multus-cni-dir\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239716 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-cnibin\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239729 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-hostroot\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239744 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-857dd\" (UniqueName: \"kubernetes.io/projected/db022de3-87d1-493a-a77d-39d56bd83c22-kube-api-access-857dd\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239767 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-system-cni-dir\") pod 
\"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239782 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db022de3-87d1-493a-a77d-39d56bd83c22-cni-binary-copy\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239819 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38344908-500a-46d6-87dd-a4160d445b93-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lnc9f\" (UID: \"38344908-500a-46d6-87dd-a4160d445b93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239899 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/12120ff3-d5ea-4737-924f-49f9c0c347b1-cnibin\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239917 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/38344908-500a-46d6-87dd-a4160d445b93-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lnc9f\" (UID: \"38344908-500a-46d6-87dd-a4160d445b93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239933 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/12120ff3-d5ea-4737-924f-49f9c0c347b1-os-release\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239952 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/12120ff3-d5ea-4737-924f-49f9c0c347b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239976 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/db022de3-87d1-493a-a77d-39d56bd83c22-multus-daemon-config\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.239997 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-etc-kubernetes\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.240016 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" 
(UniqueName: \"kubernetes.io/host-path/38344908-500a-46d6-87dd-a4160d445b93-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lnc9f\" (UID: \"38344908-500a-46d6-87dd-a4160d445b93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.240091 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/38344908-500a-46d6-87dd-a4160d445b93-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lnc9f\" (UID: \"38344908-500a-46d6-87dd-a4160d445b93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.240193 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/38344908-500a-46d6-87dd-a4160d445b93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lnc9f\" (UID: \"38344908-500a-46d6-87dd-a4160d445b93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.247495 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/38344908-500a-46d6-87dd-a4160d445b93-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lnc9f\" (UID: \"38344908-500a-46d6-87dd-a4160d445b93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.249737 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9btrm" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.250787 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38344908-500a-46d6-87dd-a4160d445b93-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lnc9f\" (UID: \"38344908-500a-46d6-87dd-a4160d445b93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.263818 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38344908-500a-46d6-87dd-a4160d445b93-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lnc9f\" (UID: \"38344908-500a-46d6-87dd-a4160d445b93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.332779 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341319 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-os-release\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341390 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/12120ff3-d5ea-4737-924f-49f9c0c347b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341414 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-multus-socket-dir-parent\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341437 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-systemd-units\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341472 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8lhb\" (UniqueName: \"kubernetes.io/projected/68d7972a-8fde-4878-a758-99ed42b3e4c5-kube-api-access-s8lhb\") pod \"network-metrics-daemon-rfts6\" (UID: \"68d7972a-8fde-4878-a758-99ed42b3e4c5\") " pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341487 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0904ad55-afbb-42a5-82e9-1f68c8d50a84-ovnkube-config\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341504 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-multus-conf-dir\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341538 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-cni-bin\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341557 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/12120ff3-d5ea-4737-924f-49f9c0c347b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: 
\"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341572 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2cbe69b9-c210-427d-9807-bf7cf7a70e3a-mcd-auth-proxy-config\") pod \"machine-config-daemon-hvwvx\" (UID: \"2cbe69b9-c210-427d-9807-bf7cf7a70e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341589 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-host-var-lib-kubelet\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341623 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-slash\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341641 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341659 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zj7b\" (UniqueName: \"kubernetes.io/projected/12120ff3-d5ea-4737-924f-49f9c0c347b1-kube-api-access-6zj7b\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341689 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjvf9\" (UniqueName: \"kubernetes.io/projected/2cbe69b9-c210-427d-9807-bf7cf7a70e3a-kube-api-access-kjvf9\") pod \"machine-config-daemon-hvwvx\" (UID: \"2cbe69b9-c210-427d-9807-bf7cf7a70e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341704 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-host-var-lib-cni-bin\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341718 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-kubelet\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341733 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-run-systemd\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341747 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-cni-netd\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341780 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/12120ff3-d5ea-4737-924f-49f9c0c347b1-system-cni-dir\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341794 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cbe69b9-c210-427d-9807-bf7cf7a70e3a-proxy-tls\") pod \"machine-config-daemon-hvwvx\" (UID: \"2cbe69b9-c210-427d-9807-bf7cf7a70e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341809 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-multus-cni-dir\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341823 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-cnibin\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341869 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-hostroot\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341886 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-857dd\" (UniqueName: \"kubernetes.io/projected/db022de3-87d1-493a-a77d-39d56bd83c22-kube-api-access-857dd\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341901 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-node-log\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341942 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-system-cni-dir\") pod \"multus-m7bbt\" 
(UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341958 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-run-openvswitch\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341972 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0904ad55-afbb-42a5-82e9-1f68c8d50a84-ovnkube-script-lib\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.341993 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db022de3-87d1-493a-a77d-39d56bd83c22-cni-binary-copy\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342025 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-run-netns\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342040 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0904ad55-afbb-42a5-82e9-1f68c8d50a84-env-overrides\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342054 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0904ad55-afbb-42a5-82e9-1f68c8d50a84-ovn-node-metrics-cert\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342098 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/12120ff3-d5ea-4737-924f-49f9c0c347b1-cnibin\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342118 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-var-lib-openvswitch\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342132 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-log-socket\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342147 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/12120ff3-d5ea-4737-924f-49f9c0c347b1-os-release\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342178 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/12120ff3-d5ea-4737-924f-49f9c0c347b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342193 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/db022de3-87d1-493a-a77d-39d56bd83c22-multus-daemon-config\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342207 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-etc-kubernetes\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342259 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzvn7\" (UniqueName: \"kubernetes.io/projected/0904ad55-afbb-42a5-82e9-1f68c8d50a84-kube-api-access-pzvn7\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342276 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-etc-openvswitch\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342292 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-run-ovn\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342309 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-host-run-k8s-cni-cncf-io\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342347 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-host-run-netns\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342367 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-host-var-lib-cni-multus\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342384 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-host-run-multus-certs\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342422 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2cbe69b9-c210-427d-9807-bf7cf7a70e3a-rootfs\") pod \"machine-config-daemon-hvwvx\" (UID: \"2cbe69b9-c210-427d-9807-bf7cf7a70e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342439 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68d7972a-8fde-4878-a758-99ed42b3e4c5-metrics-certs\") pod \"network-metrics-daemon-rfts6\" (UID: \"68d7972a-8fde-4878-a758-99ed42b3e4c5\") " pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.342455 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-run-ovn-kubernetes\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.344369 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/12120ff3-d5ea-4737-924f-49f9c0c347b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.344462 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-multus-socket-dir-parent\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.344658 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-multus-conf-dir\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.344954 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/12120ff3-d5ea-4737-924f-49f9c0c347b1-os-release\") pod 
\"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.345081 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-system-cni-dir\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.345144 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/12120ff3-d5ea-4737-924f-49f9c0c347b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.345196 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-os-release\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.345382 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-host-var-lib-kubelet\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.345701 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2cbe69b9-c210-427d-9807-bf7cf7a70e3a-mcd-auth-proxy-config\") pod \"machine-config-daemon-hvwvx\" (UID: \"2cbe69b9-c210-427d-9807-bf7cf7a70e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.345716 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-host-var-lib-cni-bin\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.345705 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db022de3-87d1-493a-a77d-39d56bd83c22-cni-binary-copy\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.345787 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-hostroot\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.345787 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/12120ff3-d5ea-4737-924f-49f9c0c347b1-system-cni-dir\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: 
I1211 05:15:31.345733 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/12120ff3-d5ea-4737-924f-49f9c0c347b1-cnibin\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.345803 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-multus-cni-dir\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.345831 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-cnibin\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.346055 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-host-var-lib-cni-multus\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.346267 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-host-run-netns\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.346271 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-host-run-k8s-cni-cncf-io\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.346291 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-host-run-multus-certs\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.346363 4628 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.346606 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/12120ff3-d5ea-4737-924f-49f9c0c347b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.346404 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2cbe69b9-c210-427d-9807-bf7cf7a70e3a-rootfs\") pod \"machine-config-daemon-hvwvx\" (UID: \"2cbe69b9-c210-427d-9807-bf7cf7a70e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:15:31 crc 
kubenswrapper[4628]: I1211 05:15:31.346532 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/db022de3-87d1-493a-a77d-39d56bd83c22-multus-daemon-config\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.346385 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db022de3-87d1-493a-a77d-39d56bd83c22-etc-kubernetes\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.346836 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68d7972a-8fde-4878-a758-99ed42b3e4c5-metrics-certs podName:68d7972a-8fde-4878-a758-99ed42b3e4c5 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:31.846808558 +0000 UTC m=+34.264155476 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68d7972a-8fde-4878-a758-99ed42b3e4c5-metrics-certs") pod "network-metrics-daemon-rfts6" (UID: "68d7972a-8fde-4878-a758-99ed42b3e4c5") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.348790 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cbe69b9-c210-427d-9807-bf7cf7a70e3a-proxy-tls\") pod \"machine-config-daemon-hvwvx\" (UID: \"2cbe69b9-c210-427d-9807-bf7cf7a70e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.368413 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zj7b\" (UniqueName: \"kubernetes.io/projected/12120ff3-d5ea-4737-924f-49f9c0c347b1-kube-api-access-6zj7b\") pod \"multus-additional-cni-plugins-jbvg4\" (UID: \"12120ff3-d5ea-4737-924f-49f9c0c347b1\") " pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.372388 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8lhb\" (UniqueName: \"kubernetes.io/projected/68d7972a-8fde-4878-a758-99ed42b3e4c5-kube-api-access-s8lhb\") pod \"network-metrics-daemon-rfts6\" (UID: \"68d7972a-8fde-4878-a758-99ed42b3e4c5\") " pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.373147 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjvf9\" (UniqueName: \"kubernetes.io/projected/2cbe69b9-c210-427d-9807-bf7cf7a70e3a-kube-api-access-kjvf9\") pod \"machine-config-daemon-hvwvx\" (UID: \"2cbe69b9-c210-427d-9807-bf7cf7a70e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.375817 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-857dd\" (UniqueName: \"kubernetes.io/projected/db022de3-87d1-493a-a77d-39d56bd83c22-kube-api-access-857dd\") pod \"multus-m7bbt\" (UID: \"db022de3-87d1-493a-a77d-39d56bd83c22\") " pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.383928 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-m7bbt" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.419994 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jbvg4" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.425931 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447281 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-node-log\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447337 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0904ad55-afbb-42a5-82e9-1f68c8d50a84-ovnkube-script-lib\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447376 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-run-openvswitch\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447415 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-run-netns\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447436 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0904ad55-afbb-42a5-82e9-1f68c8d50a84-env-overrides\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447458 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0904ad55-afbb-42a5-82e9-1f68c8d50a84-ovn-node-metrics-cert\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447484 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-var-lib-openvswitch\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447506 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-log-socket\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 
05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447524 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzvn7\" (UniqueName: \"kubernetes.io/projected/0904ad55-afbb-42a5-82e9-1f68c8d50a84-kube-api-access-pzvn7\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447554 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-etc-openvswitch\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447578 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-run-ovn\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447620 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-run-ovn-kubernetes\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447673 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-systemd-units\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447698 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0904ad55-afbb-42a5-82e9-1f68c8d50a84-ovnkube-config\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447728 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-cni-bin\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447751 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-slash\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447776 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 
05:15:31.447822 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-kubelet\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447860 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-run-systemd\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447885 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-cni-netd\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.447961 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-cni-netd\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.448014 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-node-log\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.448573 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0904ad55-afbb-42a5-82e9-1f68c8d50a84-ovnkube-script-lib\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.448617 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-run-openvswitch\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.448653 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-run-ovn-kubernetes\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.448683 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-systemd-units\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.448674 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-run-ovn\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.448715 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-run-netns\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.449150 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0904ad55-afbb-42a5-82e9-1f68c8d50a84-ovnkube-config\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.449201 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-cni-bin\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.449231 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-slash\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.449258 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.449286 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-kubelet\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.449316 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-run-systemd\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.449345 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-log-socket\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.449521 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0904ad55-afbb-42a5-82e9-1f68c8d50a84-env-overrides\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.449923 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-etc-openvswitch\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.449995 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-var-lib-openvswitch\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.469470 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0904ad55-afbb-42a5-82e9-1f68c8d50a84-ovn-node-metrics-cert\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.478911 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzvn7\" (UniqueName: \"kubernetes.io/projected/0904ad55-afbb-42a5-82e9-1f68c8d50a84-kube-api-access-pzvn7\") pod \"ovnkube-node-r7545\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.530140 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.589161 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5"] Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.589592 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.594707 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.597117 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.684667 4628 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-11 05:10:30 +0000 UTC, rotation deadline is 2026-09-18 09:46:48.613794634 +0000 UTC Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.684749 4628 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6748h31m16.929048034s for next certificate rotation Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.753417 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6f464c13-086c-44d0-8d15-65859b8322e7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qczt5\" (UID: \"6f464c13-086c-44d0-8d15-65859b8322e7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.753496 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f464c13-086c-44d0-8d15-65859b8322e7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qczt5\" (UID: \"6f464c13-086c-44d0-8d15-65859b8322e7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.753516 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6f464c13-086c-44d0-8d15-65859b8322e7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qczt5\" (UID: \"6f464c13-086c-44d0-8d15-65859b8322e7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.753536 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4hql\" (UniqueName: \"kubernetes.io/projected/6f464c13-086c-44d0-8d15-65859b8322e7-kube-api-access-s4hql\") pod \"ovnkube-control-plane-749d76644c-qczt5\" (UID: \"6f464c13-086c-44d0-8d15-65859b8322e7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.855260 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6f464c13-086c-44d0-8d15-65859b8322e7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qczt5\" (UID: \"6f464c13-086c-44d0-8d15-65859b8322e7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.855372 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68d7972a-8fde-4878-a758-99ed42b3e4c5-metrics-certs\") pod \"network-metrics-daemon-rfts6\" (UID: \"68d7972a-8fde-4878-a758-99ed42b3e4c5\") " 
pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.855435 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6f464c13-086c-44d0-8d15-65859b8322e7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qczt5\" (UID: \"6f464c13-086c-44d0-8d15-65859b8322e7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.855460 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f464c13-086c-44d0-8d15-65859b8322e7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qczt5\" (UID: \"6f464c13-086c-44d0-8d15-65859b8322e7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.855497 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4hql\" (UniqueName: \"kubernetes.io/projected/6f464c13-086c-44d0-8d15-65859b8322e7-kube-api-access-s4hql\") pod \"ovnkube-control-plane-749d76644c-qczt5\" (UID: \"6f464c13-086c-44d0-8d15-65859b8322e7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5" Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.855699 4628 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.855795 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68d7972a-8fde-4878-a758-99ed42b3e4c5-metrics-certs podName:68d7972a-8fde-4878-a758-99ed42b3e4c5 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:32.855774871 +0000 UTC m=+35.273121569 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68d7972a-8fde-4878-a758-99ed42b3e4c5-metrics-certs") pod "network-metrics-daemon-rfts6" (UID: "68d7972a-8fde-4878-a758-99ed42b3e4c5") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.856155 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6f464c13-086c-44d0-8d15-65859b8322e7-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qczt5\" (UID: \"6f464c13-086c-44d0-8d15-65859b8322e7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.857017 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6f464c13-086c-44d0-8d15-65859b8322e7-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qczt5\" (UID: \"6f464c13-086c-44d0-8d15-65859b8322e7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.858861 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f464c13-086c-44d0-8d15-65859b8322e7-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qczt5\" (UID: \"6f464c13-086c-44d0-8d15-65859b8322e7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.874935 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4hql\" (UniqueName: \"kubernetes.io/projected/6f464c13-086c-44d0-8d15-65859b8322e7-kube-api-access-s4hql\") pod \"ovnkube-control-plane-749d76644c-qczt5\" (UID: \"6f464c13-086c-44d0-8d15-65859b8322e7\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.889278 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.889372 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.889462 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.889553 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.889768 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.889895 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.907345 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.956640 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.956892 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.957041 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:39.956933009 +0000 UTC m=+42.374279747 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.957055 4628 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.957145 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.957203 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:39.957173265 +0000 UTC m=+42.374519963 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.957300 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:31 crc kubenswrapper[4628]: I1211 05:15:31.957348 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.957407 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.957467 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.957509 4628 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.957574 4628 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.957585 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:39.957560985 +0000 UTC m=+42.374907713 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.957604 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:39.957596896 +0000 UTC m=+42.374943594 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.957666 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.957695 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.957719 4628 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:31 crc kubenswrapper[4628]: E1211 05:15:31.957794 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:39.957778441 +0000 UTC m=+42.375125169 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:32 crc kubenswrapper[4628]: I1211 05:15:32.116417 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9btrm" event={"ID":"dc9a9216-ecbe-43cb-b339-8cfd9b0a25c5","Type":"ContainerStarted","Data":"cf53a854b64a015787920d277c31fd41881cdcd6d3c3fbf11d4a4682dcb8fd4a"} Dec 11 05:15:32 crc kubenswrapper[4628]: I1211 05:15:32.119183 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" event={"ID":"38344908-500a-46d6-87dd-a4160d445b93","Type":"ContainerStarted","Data":"8c692fb6003990700820be61903c1512e0e1565290dc859fafec6f6a1ee49b9b"} Dec 11 05:15:32 crc kubenswrapper[4628]: I1211 05:15:32.120255 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jkp5h" event={"ID":"43aec01e-0431-462f-b907-69ef727f84ab","Type":"ContainerStarted","Data":"527a1e5c0ed8d34530fa5a545baed4dc3c0640d4554b249f076a4113d0bbedf9"} Dec 11 05:15:32 crc kubenswrapper[4628]: I1211 05:15:32.121568 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"a93a3962f10e3f61dbf059320f7e696e416d0a37b48e63e437f20a83f5a70b3c"} Dec 11 05:15:32 crc kubenswrapper[4628]: I1211 05:15:32.122794 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m7bbt" 
event={"ID":"db022de3-87d1-493a-a77d-39d56bd83c22","Type":"ContainerStarted","Data":"08858e828186ececb26dfb7489f52553d6b4a3227bc65cbadd6fe5f4feea2d5c"} Dec 11 05:15:32 crc kubenswrapper[4628]: I1211 05:15:32.124106 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jbvg4" event={"ID":"12120ff3-d5ea-4737-924f-49f9c0c347b1","Type":"ContainerStarted","Data":"0892dc4667787a3f41d2d68252fba1a17f5fe7b08c56d5fbc0fc0a859a5c02b4"} Dec 11 05:15:32 crc kubenswrapper[4628]: W1211 05:15:32.398428 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f464c13_086c_44d0_8d15_65859b8322e7.slice/crio-da2a0d22c0e2b74c31ccab56cfcb8e2bd639dd80e5d523240981ee73bfdf7c90 WatchSource:0}: Error finding container da2a0d22c0e2b74c31ccab56cfcb8e2bd639dd80e5d523240981ee73bfdf7c90: Status 404 returned error can't find the container with id da2a0d22c0e2b74c31ccab56cfcb8e2bd639dd80e5d523240981ee73bfdf7c90 Dec 11 05:15:32 crc kubenswrapper[4628]: I1211 05:15:32.867162 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68d7972a-8fde-4878-a758-99ed42b3e4c5-metrics-certs\") pod \"network-metrics-daemon-rfts6\" (UID: \"68d7972a-8fde-4878-a758-99ed42b3e4c5\") " pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:32 crc kubenswrapper[4628]: E1211 05:15:32.867456 4628 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 05:15:32 crc kubenswrapper[4628]: E1211 05:15:32.867600 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68d7972a-8fde-4878-a758-99ed42b3e4c5-metrics-certs podName:68d7972a-8fde-4878-a758-99ed42b3e4c5 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:34.867566234 +0000 UTC m=+37.284913082 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68d7972a-8fde-4878-a758-99ed42b3e4c5-metrics-certs") pod "network-metrics-daemon-rfts6" (UID: "68d7972a-8fde-4878-a758-99ed42b3e4c5") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 05:15:32 crc kubenswrapper[4628]: I1211 05:15:32.888886 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:32 crc kubenswrapper[4628]: E1211 05:15:32.889020 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rfts6" podUID="68d7972a-8fde-4878-a758-99ed42b3e4c5" Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.132464 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"9ec1f2cc3a03e6f2a65f3f373be037e9c92627dd9b340b4986e93bb6d9cf7c4f"} Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.132507 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"1ceb30e4a3d9e4f8a0cb5dd8e8ae33f28f9c75bc4c4706b76660db8785b07748"} Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.134912 4628 generic.go:334] "Generic (PLEG): container finished" podID="12120ff3-d5ea-4737-924f-49f9c0c347b1" containerID="b895da11938235c4d7832d1575d0e383604296466bddedac3929eca62e411edb" exitCode=0 Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.135588 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jbvg4" event={"ID":"12120ff3-d5ea-4737-924f-49f9c0c347b1","Type":"ContainerDied","Data":"b895da11938235c4d7832d1575d0e383604296466bddedac3929eca62e411edb"} Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.138467 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9btrm" event={"ID":"dc9a9216-ecbe-43cb-b339-8cfd9b0a25c5","Type":"ContainerStarted","Data":"0340100139e0c83e3730ff069b2320a8a1dfaaff1ab9405f5a1d583fe6cbf20a"} Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.139725 4628 generic.go:334] "Generic (PLEG): container finished" podID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerID="827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767" exitCode=0 Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.139752 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerDied","Data":"827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767"} Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.139788 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerStarted","Data":"1141279842bfb3efe7f24ded16aaefd3db8a03eb6025a837f295bf8655f2e1a1"} Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.141020 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" event={"ID":"38344908-500a-46d6-87dd-a4160d445b93","Type":"ContainerStarted","Data":"914cbfe0f7d7e29dea686bbbe8696f0a6d7cf17f6a2d42b9f9b56c194396de18"} Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.146274 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jkp5h" event={"ID":"43aec01e-0431-462f-b907-69ef727f84ab","Type":"ContainerStarted","Data":"7989f7f0c94532ebddcf7430c4fa4466d9a3dffbe4216ce45e9e8ac4d5a9f9d0"} Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.151427 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5" 
event={"ID":"6f464c13-086c-44d0-8d15-65859b8322e7","Type":"ContainerStarted","Data":"e623aa6e49cbb436a3a60d618ab513fc6d0f84e3b94a54e417cc6ff598a3c8b9"} Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.151465 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5" event={"ID":"6f464c13-086c-44d0-8d15-65859b8322e7","Type":"ContainerStarted","Data":"5b6d1f0b26527dd8c5a230acf7ac4120f4a25e3d0c4512dbe844fe0530dc52ce"} Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.151475 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5" event={"ID":"6f464c13-086c-44d0-8d15-65859b8322e7","Type":"ContainerStarted","Data":"da2a0d22c0e2b74c31ccab56cfcb8e2bd639dd80e5d523240981ee73bfdf7c90"} Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.154442 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m7bbt" event={"ID":"db022de3-87d1-493a-a77d-39d56bd83c22","Type":"ContainerStarted","Data":"55eb6510038209e1df420ab43dba3056f51d24eb63d8e27fad4be5844758bec6"} Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.163662 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podStartSLOduration=3.16364584 podStartE2EDuration="3.16364584s" podCreationTimestamp="2025-12-11 05:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:33.160358212 +0000 UTC m=+35.577704920" watchObservedRunningTime="2025-12-11 05:15:33.16364584 +0000 UTC m=+35.580992538" Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.228006 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-m7bbt" podStartSLOduration=2.227986396 podStartE2EDuration="2.227986396s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:33.190353942 +0000 UTC m=+35.607700650" watchObservedRunningTime="2025-12-11 05:15:33.227986396 +0000 UTC m=+35.645333094" Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.251542 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9btrm" podStartSLOduration=3.251528304 podStartE2EDuration="3.251528304s" podCreationTimestamp="2025-12-11 05:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:33.249595632 +0000 UTC m=+35.666942330" watchObservedRunningTime="2025-12-11 05:15:33.251528304 +0000 UTC m=+35.668875002" Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.268147 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jkp5h" podStartSLOduration=3.268127717 podStartE2EDuration="3.268127717s" podCreationTimestamp="2025-12-11 05:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:33.265724702 +0000 UTC m=+35.683071400" watchObservedRunningTime="2025-12-11 05:15:33.268127717 +0000 UTC m=+35.685474415" Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.289982 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lnc9f" podStartSLOduration=3.289959939 podStartE2EDuration="3.289959939s" podCreationTimestamp="2025-12-11 05:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:33.28928636 +0000 UTC m=+35.706633058" watchObservedRunningTime="2025-12-11 05:15:33.289959939 +0000 UTC m=+35.707306637" Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.889701 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:33 crc kubenswrapper[4628]: E1211 05:15:33.890183 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.890671 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:33 crc kubenswrapper[4628]: E1211 05:15:33.890766 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 05:15:33 crc kubenswrapper[4628]: I1211 05:15:33.890815 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:33 crc kubenswrapper[4628]: E1211 05:15:33.890870 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 05:15:34 crc kubenswrapper[4628]: I1211 05:15:34.170802 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerStarted","Data":"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa"} Dec 11 05:15:34 crc kubenswrapper[4628]: I1211 05:15:34.171175 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerStarted","Data":"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853"} Dec 11 05:15:34 crc kubenswrapper[4628]: I1211 05:15:34.171189 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerStarted","Data":"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f"} Dec 11 05:15:34 crc kubenswrapper[4628]: I1211 05:15:34.171201 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerStarted","Data":"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59"} Dec 11 05:15:34 crc kubenswrapper[4628]: I1211 05:15:34.171215 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerStarted","Data":"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d"} Dec 11 05:15:34 crc kubenswrapper[4628]: I1211 05:15:34.171226 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerStarted","Data":"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d"} Dec 11 05:15:34 crc kubenswrapper[4628]: I1211 05:15:34.174290 4628 generic.go:334] "Generic (PLEG): container finished" podID="12120ff3-d5ea-4737-924f-49f9c0c347b1" containerID="e866867a123dfd2345ccd0da46b2e2744c9bc9fd542a9be613128e2490c0bae8" exitCode=0 Dec 11 05:15:34 crc kubenswrapper[4628]: I1211 05:15:34.174435 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jbvg4" event={"ID":"12120ff3-d5ea-4737-924f-49f9c0c347b1","Type":"ContainerDied","Data":"e866867a123dfd2345ccd0da46b2e2744c9bc9fd542a9be613128e2490c0bae8"} Dec 11 05:15:34 crc kubenswrapper[4628]: I1211 05:15:34.193074 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qczt5" podStartSLOduration=3.193054953 podStartE2EDuration="3.193054953s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:33.329336289 +0000 UTC m=+35.746682977" watchObservedRunningTime="2025-12-11 05:15:34.193054953 +0000 UTC m=+36.610401651" Dec 11 05:15:34 crc kubenswrapper[4628]: I1211 05:15:34.888537 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:34 crc kubenswrapper[4628]: E1211 05:15:34.888677 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfts6" podUID="68d7972a-8fde-4878-a758-99ed42b3e4c5" Dec 11 05:15:34 crc kubenswrapper[4628]: I1211 05:15:34.891982 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68d7972a-8fde-4878-a758-99ed42b3e4c5-metrics-certs\") pod \"network-metrics-daemon-rfts6\" (UID: \"68d7972a-8fde-4878-a758-99ed42b3e4c5\") " pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:34 crc kubenswrapper[4628]: E1211 05:15:34.892131 4628 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 05:15:34 crc kubenswrapper[4628]: E1211 05:15:34.892177 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68d7972a-8fde-4878-a758-99ed42b3e4c5-metrics-certs podName:68d7972a-8fde-4878-a758-99ed42b3e4c5 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:38.892161147 +0000 UTC m=+41.309507845 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68d7972a-8fde-4878-a758-99ed42b3e4c5-metrics-certs") pod "network-metrics-daemon-rfts6" (UID: "68d7972a-8fde-4878-a758-99ed42b3e4c5") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 05:15:35 crc kubenswrapper[4628]: I1211 05:15:35.181168 4628 generic.go:334] "Generic (PLEG): container finished" podID="12120ff3-d5ea-4737-924f-49f9c0c347b1" containerID="5caaf7ef568014f1f80ccad526efb3376c298452a0f79cad85b5ec0bd7938952" exitCode=0 Dec 11 05:15:35 crc kubenswrapper[4628]: I1211 05:15:35.181211 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jbvg4" event={"ID":"12120ff3-d5ea-4737-924f-49f9c0c347b1","Type":"ContainerDied","Data":"5caaf7ef568014f1f80ccad526efb3376c298452a0f79cad85b5ec0bd7938952"} Dec 11 05:15:35 crc kubenswrapper[4628]: I1211 05:15:35.620628 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:15:35 crc kubenswrapper[4628]: I1211 05:15:35.888658 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:35 crc kubenswrapper[4628]: I1211 05:15:35.888668 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:35 crc kubenswrapper[4628]: I1211 05:15:35.888689 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:35 crc kubenswrapper[4628]: E1211 05:15:35.889096 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 05:15:35 crc kubenswrapper[4628]: E1211 05:15:35.888876 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 05:15:35 crc kubenswrapper[4628]: E1211 05:15:35.889088 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 05:15:36 crc kubenswrapper[4628]: I1211 05:15:36.187990 4628 generic.go:334] "Generic (PLEG): container finished" podID="12120ff3-d5ea-4737-924f-49f9c0c347b1" containerID="9b8f3ddce9d6d627b63f6bbcfde75241c6b505441b903011e7448c91cbca1bc1" exitCode=0 Dec 11 05:15:36 crc kubenswrapper[4628]: I1211 05:15:36.188061 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jbvg4" event={"ID":"12120ff3-d5ea-4737-924f-49f9c0c347b1","Type":"ContainerDied","Data":"9b8f3ddce9d6d627b63f6bbcfde75241c6b505441b903011e7448c91cbca1bc1"} Dec 11 05:15:36 crc kubenswrapper[4628]: I1211 05:15:36.889422 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:36 crc kubenswrapper[4628]: E1211 05:15:36.889898 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfts6" podUID="68d7972a-8fde-4878-a758-99ed42b3e4c5" Dec 11 05:15:37 crc kubenswrapper[4628]: I1211 05:15:37.201282 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerStarted","Data":"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84"} Dec 11 05:15:37 crc kubenswrapper[4628]: I1211 05:15:37.208996 4628 generic.go:334] "Generic (PLEG): container finished" podID="12120ff3-d5ea-4737-924f-49f9c0c347b1" containerID="50fc26e40d55345a8dbc6fdfd4b759f4eb5245898b7faa2762f0b00a35dacb8c" exitCode=0 Dec 11 05:15:37 crc kubenswrapper[4628]: I1211 05:15:37.209055 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jbvg4" event={"ID":"12120ff3-d5ea-4737-924f-49f9c0c347b1","Type":"ContainerDied","Data":"50fc26e40d55345a8dbc6fdfd4b759f4eb5245898b7faa2762f0b00a35dacb8c"} Dec 11 05:15:37 crc kubenswrapper[4628]: I1211 05:15:37.749498 4628 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 11 05:15:37 crc kubenswrapper[4628]: I1211 05:15:37.890041 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:37 crc kubenswrapper[4628]: E1211 05:15:37.894066 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 05:15:37 crc kubenswrapper[4628]: I1211 05:15:37.894109 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:37 crc kubenswrapper[4628]: E1211 05:15:37.894247 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 05:15:37 crc kubenswrapper[4628]: I1211 05:15:37.894410 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:37 crc kubenswrapper[4628]: E1211 05:15:37.894623 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 05:15:38 crc kubenswrapper[4628]: I1211 05:15:38.218305 4628 generic.go:334] "Generic (PLEG): container finished" podID="12120ff3-d5ea-4737-924f-49f9c0c347b1" containerID="d932bfd8ba8870641c5864fc83e39bb25e9ac5406232e94aa36ebb251f00c757" exitCode=0 Dec 11 05:15:38 crc kubenswrapper[4628]: I1211 05:15:38.218350 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jbvg4" event={"ID":"12120ff3-d5ea-4737-924f-49f9c0c347b1","Type":"ContainerDied","Data":"d932bfd8ba8870641c5864fc83e39bb25e9ac5406232e94aa36ebb251f00c757"} Dec 11 05:15:38 crc kubenswrapper[4628]: I1211 05:15:38.889276 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:38 crc kubenswrapper[4628]: E1211 05:15:38.889639 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rfts6" podUID="68d7972a-8fde-4878-a758-99ed42b3e4c5" Dec 11 05:15:38 crc kubenswrapper[4628]: I1211 05:15:38.961612 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68d7972a-8fde-4878-a758-99ed42b3e4c5-metrics-certs\") pod \"network-metrics-daemon-rfts6\" (UID: \"68d7972a-8fde-4878-a758-99ed42b3e4c5\") " pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:38 crc kubenswrapper[4628]: E1211 05:15:38.961735 4628 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 05:15:38 crc kubenswrapper[4628]: E1211 05:15:38.961789 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68d7972a-8fde-4878-a758-99ed42b3e4c5-metrics-certs podName:68d7972a-8fde-4878-a758-99ed42b3e4c5 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:46.961776056 +0000 UTC m=+49.379122754 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68d7972a-8fde-4878-a758-99ed42b3e4c5-metrics-certs") pod "network-metrics-daemon-rfts6" (UID: "68d7972a-8fde-4878-a758-99ed42b3e4c5") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 11 05:15:39 crc kubenswrapper[4628]: I1211 05:15:39.227305 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerStarted","Data":"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085"} Dec 11 05:15:39 crc kubenswrapper[4628]: I1211 05:15:39.227628 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:39 crc kubenswrapper[4628]: I1211 05:15:39.227659 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:39 crc kubenswrapper[4628]: I1211 05:15:39.233396 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jbvg4" event={"ID":"12120ff3-d5ea-4737-924f-49f9c0c347b1","Type":"ContainerStarted","Data":"d7cba50d9393d1e5c67f64f17a5bb3420727e395c1fe135f8a59825076b34165"} Dec 11 05:15:39 crc kubenswrapper[4628]: I1211 05:15:39.257766 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:39 crc kubenswrapper[4628]: I1211 05:15:39.275959 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" podStartSLOduration=8.275933274 podStartE2EDuration="8.275933274s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:39.275440261 +0000 UTC m=+41.692787029" watchObservedRunningTime="2025-12-11 05:15:39.275933274 +0000 UTC m=+41.693280032" Dec 11 05:15:39 crc kubenswrapper[4628]: I1211 05:15:39.339405 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jbvg4" podStartSLOduration=8.339367586 podStartE2EDuration="8.339367586s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-11 05:15:39.338151204 +0000 UTC m=+41.755497952" watchObservedRunningTime="2025-12-11 05:15:39.339367586 +0000 UTC m=+41.756714294" Dec 11 05:15:39 crc kubenswrapper[4628]: I1211 05:15:39.888542 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:39 crc kubenswrapper[4628]: I1211 05:15:39.888542 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:39 crc kubenswrapper[4628]: I1211 05:15:39.888637 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:39 crc kubenswrapper[4628]: E1211 05:15:39.888821 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 05:15:39 crc kubenswrapper[4628]: E1211 05:15:39.889155 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 05:15:39 crc kubenswrapper[4628]: E1211 05:15:39.889029 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 05:15:39 crc kubenswrapper[4628]: I1211 05:15:39.972803 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:39 crc kubenswrapper[4628]: I1211 05:15:39.972974 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:39 crc kubenswrapper[4628]: I1211 05:15:39.973006 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:39 crc kubenswrapper[4628]: E1211 05:15:39.973076 4628 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 05:15:39 crc kubenswrapper[4628]: E1211 05:15:39.973090 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:55.973059085 +0000 UTC m=+58.390405803 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:39 crc kubenswrapper[4628]: E1211 05:15:39.973125 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:55.973116157 +0000 UTC m=+58.390462845 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 11 05:15:39 crc kubenswrapper[4628]: I1211 05:15:39.973188 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:39 crc kubenswrapper[4628]: E1211 05:15:39.973256 4628 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 05:15:39 crc kubenswrapper[4628]: I1211 05:15:39.973260 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:39 crc kubenswrapper[4628]: E1211 05:15:39.973304 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:55.973293392 +0000 UTC m=+58.390640090 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 11 05:15:39 crc kubenswrapper[4628]: E1211 05:15:39.973335 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 05:15:39 crc kubenswrapper[4628]: E1211 05:15:39.973350 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 05:15:39 crc kubenswrapper[4628]: E1211 05:15:39.973362 4628 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:39 crc kubenswrapper[4628]: E1211 05:15:39.973401 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:55.973390774 +0000 UTC m=+58.390737472 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:39 crc kubenswrapper[4628]: E1211 05:15:39.973439 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 11 05:15:39 crc kubenswrapper[4628]: E1211 05:15:39.973464 4628 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 11 05:15:39 crc kubenswrapper[4628]: E1211 05:15:39.973482 4628 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:39 crc kubenswrapper[4628]: E1211 05:15:39.973545 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-11 05:15:55.973529328 +0000 UTC m=+58.390876056 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 11 05:15:40 crc kubenswrapper[4628]: I1211 05:15:40.237946 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:40 crc kubenswrapper[4628]: I1211 05:15:40.256454 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:15:40 crc kubenswrapper[4628]: I1211 05:15:40.630979 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rfts6"] Dec 11 05:15:40 crc kubenswrapper[4628]: I1211 05:15:40.631120 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:40 crc kubenswrapper[4628]: E1211 05:15:40.631234 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfts6" podUID="68d7972a-8fde-4878-a758-99ed42b3e4c5" Dec 11 05:15:41 crc kubenswrapper[4628]: I1211 05:15:41.888725 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:41 crc kubenswrapper[4628]: I1211 05:15:41.888799 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:41 crc kubenswrapper[4628]: I1211 05:15:41.888865 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:41 crc kubenswrapper[4628]: E1211 05:15:41.889042 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 05:15:41 crc kubenswrapper[4628]: I1211 05:15:41.889392 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:41 crc kubenswrapper[4628]: E1211 05:15:41.889459 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfts6" podUID="68d7972a-8fde-4878-a758-99ed42b3e4c5" Dec 11 05:15:41 crc kubenswrapper[4628]: E1211 05:15:41.889623 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 05:15:41 crc kubenswrapper[4628]: E1211 05:15:41.889771 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 05:15:43 crc kubenswrapper[4628]: I1211 05:15:43.889179 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:43 crc kubenswrapper[4628]: I1211 05:15:43.889477 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:43 crc kubenswrapper[4628]: I1211 05:15:43.889346 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:43 crc kubenswrapper[4628]: I1211 05:15:43.889284 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:43 crc kubenswrapper[4628]: E1211 05:15:43.889617 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 11 05:15:43 crc kubenswrapper[4628]: E1211 05:15:43.889928 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 11 05:15:43 crc kubenswrapper[4628]: E1211 05:15:43.890097 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfts6" podUID="68d7972a-8fde-4878-a758-99ed42b3e4c5" Dec 11 05:15:43 crc kubenswrapper[4628]: E1211 05:15:43.890153 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.546234 4628 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.546767 4628 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.594481 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8sttg"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.595104 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.599259 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-76fr7"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.599841 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-76fr7" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.603064 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.603684 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.603716 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.608406 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.609570 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.611632 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-n4h96"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.612401 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-t7rx5"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.612726 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-t7rx5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.613192 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.613520 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lt4q5"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.614297 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4q5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.615167 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.615243 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.616251 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.616320 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.616675 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.616719 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.616886 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.616908 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.616954 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.616885 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.617093 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.618255 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.618404 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kbrpw"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.619242 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.620310 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-czhht"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.626186 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.642812 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.642921 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.643872 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.655142 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.655316 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.656086 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.657018 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.659619 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-4nw5h"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.659742 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.662773 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-62v78"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.663171 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vvw9w"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.663430 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.663890 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.664025 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-62v78" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.664376 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vvw9w" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.664823 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.666880 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.667036 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.667121 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.667257 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.667356 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.667582 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.667653 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.667681 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.667873 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.667937 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.667959 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.668002 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.667965 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.668246 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wg98b"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.668361 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.668466 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.668573 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.668586 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.668666 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 11 05:15:44 crc 
kubenswrapper[4628]: I1211 05:15:44.668718 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.668712 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.668799 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.668881 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.668905 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.668365 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.669051 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.669155 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.669195 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.669251 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.669295 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.668729 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.669562 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.670077 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.670377 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.670762 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.671664 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.672824 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.674071 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-t7rx5"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.674101 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-n4h96"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.674965 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.683255 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-62v78"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.708393 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lt4q5"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.708649 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.708860 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.709392 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.714975 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.715749 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.716057 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.716549 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.717318 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.717935 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.717951 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.718072 4628 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.718124 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.718190 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.718290 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.718302 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.718411 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.718432 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.718520 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.718763 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.718875 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.718943 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.720295 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.718989 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.719042 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.720479 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.721394 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.724306 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.724464 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.724587 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.724601 4628 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.724699 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.724820 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.724837 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.724963 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.724997 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.725085 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.725114 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.725353 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.725475 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.727747 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.728894 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.749231 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.749515 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.749817 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.750402 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.759309 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wg98b"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.759379 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vvw9w"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.759435 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.759932 4628 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.760116 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a88efb22-2511-4037-9257-102b56de5226-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-czhht\" (UID: \"a88efb22-2511-4037-9257-102b56de5226\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.760151 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-encryption-config\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.760195 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7210231-68df-4f2b-888f-90827f723bd2-client-ca\") pod \"route-controller-manager-6576b87f9c-ns4xc\" (UID: \"e7210231-68df-4f2b-888f-90827f723bd2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.760243 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12244fa9-e2af-46bc-a35c-acc85884e68b-serving-cert\") pod \"console-operator-58897d9998-t7rx5\" (UID: \"12244fa9-e2af-46bc-a35c-acc85884e68b\") " pod="openshift-console-operator/console-operator-58897d9998-t7rx5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.760283 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a26a57-89a7-4c5c-902c-a19020e4a01a-serving-cert\") pod \"controller-manager-879f6c89f-8sttg\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.760305 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12244fa9-e2af-46bc-a35c-acc85884e68b-trusted-ca\") pod \"console-operator-58897d9998-t7rx5\" (UID: \"12244fa9-e2af-46bc-a35c-acc85884e68b\") " pod="openshift-console-operator/console-operator-58897d9998-t7rx5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.760326 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.759913 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kbrpw"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.760989 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.760326 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82af3a60-6260-410c-a3c2-16acf3f30bb5-etcd-client\") pod 
\"etcd-operator-b45778765-kbrpw\" (UID: \"82af3a60-6260-410c-a3c2-16acf3f30bb5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.761033 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlnll\" (UniqueName: \"kubernetes.io/projected/a2361138-5571-4a9b-8ac9-a0cac66d682a-kube-api-access-xlnll\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.761060 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/82af3a60-6260-410c-a3c2-16acf3f30bb5-etcd-ca\") pod \"etcd-operator-b45778765-kbrpw\" (UID: \"82af3a60-6260-410c-a3c2-16acf3f30bb5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.761085 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66dt2\" (UniqueName: \"kubernetes.io/projected/652211a1-b8d0-427d-b6e0-abf88c891f25-kube-api-access-66dt2\") pod \"cluster-samples-operator-665b6dd947-76fr7\" (UID: \"652211a1-b8d0-427d-b6e0-abf88c891f25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-76fr7" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.761111 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/575ba7ec-e024-40c7-be59-44a90232b4f2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lt4q5\" (UID: \"575ba7ec-e024-40c7-be59-44a90232b4f2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4q5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.761141 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-config\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.761166 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-etcd-serving-ca\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.761220 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5wr2\" (UniqueName: \"kubernetes.io/projected/575ba7ec-e024-40c7-be59-44a90232b4f2-kube-api-access-t5wr2\") pod \"machine-api-operator-5694c8668f-lt4q5\" (UID: \"575ba7ec-e024-40c7-be59-44a90232b4f2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4q5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.761510 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12244fa9-e2af-46bc-a35c-acc85884e68b-config\") pod \"console-operator-58897d9998-t7rx5\" (UID: \"12244fa9-e2af-46bc-a35c-acc85884e68b\") " 
pod="openshift-console-operator/console-operator-58897d9998-t7rx5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.761545 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-audit\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.761575 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-audit-dir\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762053 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a26a57-89a7-4c5c-902c-a19020e4a01a-config\") pod \"controller-manager-879f6c89f-8sttg\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762124 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2361138-5571-4a9b-8ac9-a0cac66d682a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762311 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2361138-5571-4a9b-8ac9-a0cac66d682a-serving-cert\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762419 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2361138-5571-4a9b-8ac9-a0cac66d682a-audit-dir\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762442 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-node-pullsecrets\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762468 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnr5f\" (UniqueName: \"kubernetes.io/projected/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-kube-api-access-wnr5f\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762506 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/a88efb22-2511-4037-9257-102b56de5226-serving-cert\") pod \"authentication-operator-69f744f599-czhht\" (UID: \"a88efb22-2511-4037-9257-102b56de5226\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762531 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdzh8\" (UniqueName: \"kubernetes.io/projected/e7210231-68df-4f2b-888f-90827f723bd2-kube-api-access-pdzh8\") pod \"route-controller-manager-6576b87f9c-ns4xc\" (UID: \"e7210231-68df-4f2b-888f-90827f723bd2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762579 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a88efb22-2511-4037-9257-102b56de5226-service-ca-bundle\") pod \"authentication-operator-69f744f599-czhht\" (UID: \"a88efb22-2511-4037-9257-102b56de5226\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762606 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnfkc\" (UniqueName: \"kubernetes.io/projected/d5a26a57-89a7-4c5c-902c-a19020e4a01a-kube-api-access-fnfkc\") pod \"controller-manager-879f6c89f-8sttg\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762635 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82af3a60-6260-410c-a3c2-16acf3f30bb5-config\") pod \"etcd-operator-b45778765-kbrpw\" (UID: \"82af3a60-6260-410c-a3c2-16acf3f30bb5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762663 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7210231-68df-4f2b-888f-90827f723bd2-config\") pod \"route-controller-manager-6576b87f9c-ns4xc\" (UID: \"e7210231-68df-4f2b-888f-90827f723bd2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762706 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/652211a1-b8d0-427d-b6e0-abf88c891f25-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-76fr7\" (UID: \"652211a1-b8d0-427d-b6e0-abf88c891f25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-76fr7" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762723 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a88efb22-2511-4037-9257-102b56de5226-config\") pod \"authentication-operator-69f744f599-czhht\" (UID: \"a88efb22-2511-4037-9257-102b56de5226\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762742 4628 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-trusted-ca-bundle\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762765 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82af3a60-6260-410c-a3c2-16acf3f30bb5-serving-cert\") pod \"etcd-operator-b45778765-kbrpw\" (UID: \"82af3a60-6260-410c-a3c2-16acf3f30bb5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762783 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/575ba7ec-e024-40c7-be59-44a90232b4f2-images\") pod \"machine-api-operator-5694c8668f-lt4q5\" (UID: \"575ba7ec-e024-40c7-be59-44a90232b4f2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4q5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762803 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4zm5\" (UniqueName: \"kubernetes.io/projected/82af3a60-6260-410c-a3c2-16acf3f30bb5-kube-api-access-r4zm5\") pod \"etcd-operator-b45778765-kbrpw\" (UID: \"82af3a60-6260-410c-a3c2-16acf3f30bb5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.762823 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2361138-5571-4a9b-8ac9-a0cac66d682a-audit-policies\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.763801 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4nw5h"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.764816 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.767749 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.771319 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-76fr7"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.772009 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7210231-68df-4f2b-888f-90827f723bd2-serving-cert\") pod \"route-controller-manager-6576b87f9c-ns4xc\" (UID: \"e7210231-68df-4f2b-888f-90827f723bd2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.772102 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/575ba7ec-e024-40c7-be59-44a90232b4f2-config\") pod \"machine-api-operator-5694c8668f-lt4q5\" (UID: 
\"575ba7ec-e024-40c7-be59-44a90232b4f2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4q5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.772136 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/82af3a60-6260-410c-a3c2-16acf3f30bb5-etcd-service-ca\") pod \"etcd-operator-b45778765-kbrpw\" (UID: \"82af3a60-6260-410c-a3c2-16acf3f30bb5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.772154 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a2361138-5571-4a9b-8ac9-a0cac66d682a-etcd-client\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.772178 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-serving-cert\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.772205 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a2361138-5571-4a9b-8ac9-a0cac66d682a-encryption-config\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.772242 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a2361138-5571-4a9b-8ac9-a0cac66d682a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.772262 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-etcd-client\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.772294 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-image-import-ca\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.772314 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d5a26a57-89a7-4c5c-902c-a19020e4a01a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8sttg\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.772336 4628 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb8sq\" (UniqueName: \"kubernetes.io/projected/a88efb22-2511-4037-9257-102b56de5226-kube-api-access-jb8sq\") pod \"authentication-operator-69f744f599-czhht\" (UID: \"a88efb22-2511-4037-9257-102b56de5226\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.772361 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5a26a57-89a7-4c5c-902c-a19020e4a01a-client-ca\") pod \"controller-manager-879f6c89f-8sttg\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.772382 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8ppv\" (UniqueName: \"kubernetes.io/projected/12244fa9-e2af-46bc-a35c-acc85884e68b-kube-api-access-q8ppv\") pod \"console-operator-58897d9998-t7rx5\" (UID: \"12244fa9-e2af-46bc-a35c-acc85884e68b\") " pod="openshift-console-operator/console-operator-58897d9998-t7rx5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.772561 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8sttg"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.772833 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.773136 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.773350 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.778804 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7kdnp"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.779443 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbf8w"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.779798 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbf8w" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.779955 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k4bsx"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.780091 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7kdnp" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.780604 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.780726 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k4bsx" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.783053 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kjhs"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.783718 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6qvrg"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.784092 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xwptc"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.784528 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xwptc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.784826 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kjhs" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.785008 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.787925 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.787977 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-czhht"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.787993 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.792343 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-czh2t"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.794600 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.794740 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-czh2t" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.801624 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6vrgl"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.802593 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-78pgf"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.804202 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zmqdv"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.812551 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-8z6mf"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.812770 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zmqdv" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.812876 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.813428 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-78pgf" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.816341 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.817950 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.843716 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.844727 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.849276 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2dgsz"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.849701 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.850743 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.851137 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dgsz" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.851571 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7kdnp"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.852960 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.858502 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.863744 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.864931 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.865441 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.870964 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7x6xf"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.871268 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.872204 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbf8w"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.872231 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k4bsx"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.872311 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7x6xf" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873084 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873112 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/30bc62a3-63c3-4cab-bbdf-b790a500d378-metrics-tls\") pod \"dns-operator-744455d44c-62v78\" (UID: \"30bc62a3-63c3-4cab-bbdf-b790a500d378\") " pod="openshift-dns-operator/dns-operator-744455d44c-62v78" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873133 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-audit-dir\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873151 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873182 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7210231-68df-4f2b-888f-90827f723bd2-client-ca\") pod \"route-controller-manager-6576b87f9c-ns4xc\" (UID: \"e7210231-68df-4f2b-888f-90827f723bd2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873204 
4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873231 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12244fa9-e2af-46bc-a35c-acc85884e68b-serving-cert\") pod \"console-operator-58897d9998-t7rx5\" (UID: \"12244fa9-e2af-46bc-a35c-acc85884e68b\") " pod="openshift-console-operator/console-operator-58897d9998-t7rx5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873252 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlnll\" (UniqueName: \"kubernetes.io/projected/a2361138-5571-4a9b-8ac9-a0cac66d682a-kube-api-access-xlnll\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873270 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873305 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a26a57-89a7-4c5c-902c-a19020e4a01a-serving-cert\") pod \"controller-manager-879f6c89f-8sttg\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873325 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12244fa9-e2af-46bc-a35c-acc85884e68b-trusted-ca\") pod \"console-operator-58897d9998-t7rx5\" (UID: \"12244fa9-e2af-46bc-a35c-acc85884e68b\") " pod="openshift-console-operator/console-operator-58897d9998-t7rx5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873344 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82af3a60-6260-410c-a3c2-16acf3f30bb5-etcd-client\") pod \"etcd-operator-b45778765-kbrpw\" (UID: \"82af3a60-6260-410c-a3c2-16acf3f30bb5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873361 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-trusted-ca-bundle\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873381 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/82af3a60-6260-410c-a3c2-16acf3f30bb5-etcd-ca\") pod \"etcd-operator-b45778765-kbrpw\" (UID: \"82af3a60-6260-410c-a3c2-16acf3f30bb5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873402 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66dt2\" (UniqueName: \"kubernetes.io/projected/652211a1-b8d0-427d-b6e0-abf88c891f25-kube-api-access-66dt2\") pod \"cluster-samples-operator-665b6dd947-76fr7\" (UID: \"652211a1-b8d0-427d-b6e0-abf88c891f25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-76fr7" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873421 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/575ba7ec-e024-40c7-be59-44a90232b4f2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lt4q5\" (UID: \"575ba7ec-e024-40c7-be59-44a90232b4f2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4q5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873440 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873459 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-config\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873476 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-etcd-serving-ca\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873507 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrk5r\" (UniqueName: \"kubernetes.io/projected/3da56a92-c008-4f39-825b-baedf4f3195d-kube-api-access-hrk5r\") pod \"cluster-image-registry-operator-dc59b4c8b-54kds\" (UID: \"3da56a92-c008-4f39-825b-baedf4f3195d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873525 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5wr2\" (UniqueName: \"kubernetes.io/projected/575ba7ec-e024-40c7-be59-44a90232b4f2-kube-api-access-t5wr2\") pod \"machine-api-operator-5694c8668f-lt4q5\" (UID: \"575ba7ec-e024-40c7-be59-44a90232b4f2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4q5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873543 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-oauth-serving-cert\") pod 
\"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873563 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/53447f0d-9279-4e5f-a63c-a1b050d24b4b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wwzr5\" (UID: \"53447f0d-9279-4e5f-a63c-a1b050d24b4b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873580 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kf9p\" (UniqueName: \"kubernetes.io/projected/5111b417-34a8-405f-a0b8-eab04e144ff8-kube-api-access-7kf9p\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873599 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873617 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12244fa9-e2af-46bc-a35c-acc85884e68b-config\") pod \"console-operator-58897d9998-t7rx5\" (UID: \"12244fa9-e2af-46bc-a35c-acc85884e68b\") " pod="openshift-console-operator/console-operator-58897d9998-t7rx5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873634 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-audit\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873651 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-audit-dir\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873671 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12e38faa-255c-42bb-b9c1-faea88d6c989-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vvw9w\" (UID: \"12e38faa-255c-42bb-b9c1-faea88d6c989\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vvw9w" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873687 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e38faa-255c-42bb-b9c1-faea88d6c989-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vvw9w\" (UID: \"12e38faa-255c-42bb-b9c1-faea88d6c989\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vvw9w" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873705 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a26a57-89a7-4c5c-902c-a19020e4a01a-config\") pod \"controller-manager-879f6c89f-8sttg\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873723 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2361138-5571-4a9b-8ac9-a0cac66d682a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873739 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-node-pullsecrets\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873757 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnr5f\" (UniqueName: \"kubernetes.io/projected/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-kube-api-access-wnr5f\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873775 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2361138-5571-4a9b-8ac9-a0cac66d682a-serving-cert\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873792 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2361138-5571-4a9b-8ac9-a0cac66d682a-audit-dir\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873808 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdzh8\" (UniqueName: \"kubernetes.io/projected/e7210231-68df-4f2b-888f-90827f723bd2-kube-api-access-pdzh8\") pod \"route-controller-manager-6576b87f9c-ns4xc\" (UID: \"e7210231-68df-4f2b-888f-90827f723bd2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873825 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a88efb22-2511-4037-9257-102b56de5226-serving-cert\") pod \"authentication-operator-69f744f599-czhht\" (UID: \"a88efb22-2511-4037-9257-102b56de5226\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873870 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/a88efb22-2511-4037-9257-102b56de5226-service-ca-bundle\") pod \"authentication-operator-69f744f599-czhht\" (UID: \"a88efb22-2511-4037-9257-102b56de5226\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873889 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mncrp\" (UniqueName: \"kubernetes.io/projected/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-kube-api-access-mncrp\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873909 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnfkc\" (UniqueName: \"kubernetes.io/projected/d5a26a57-89a7-4c5c-902c-a19020e4a01a-kube-api-access-fnfkc\") pod \"controller-manager-879f6c89f-8sttg\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873925 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-audit-policies\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873941 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949f77a9-d70d-45ad-8c60-554afb860a62-config\") pod \"machine-approver-56656f9798-mfqxc\" (UID: \"949f77a9-d70d-45ad-8c60-554afb860a62\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873958 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82af3a60-6260-410c-a3c2-16acf3f30bb5-config\") pod \"etcd-operator-b45778765-kbrpw\" (UID: \"82af3a60-6260-410c-a3c2-16acf3f30bb5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873974 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7210231-68df-4f2b-888f-90827f723bd2-config\") pod \"route-controller-manager-6576b87f9c-ns4xc\" (UID: \"e7210231-68df-4f2b-888f-90827f723bd2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.873990 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53447f0d-9279-4e5f-a63c-a1b050d24b4b-serving-cert\") pod \"openshift-config-operator-7777fb866f-wwzr5\" (UID: \"53447f0d-9279-4e5f-a63c-a1b050d24b4b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874017 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/652211a1-b8d0-427d-b6e0-abf88c891f25-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-76fr7\" (UID: \"652211a1-b8d0-427d-b6e0-abf88c891f25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-76fr7" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874034 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a88efb22-2511-4037-9257-102b56de5226-config\") pod \"authentication-operator-69f744f599-czhht\" (UID: \"a88efb22-2511-4037-9257-102b56de5226\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874052 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-trusted-ca-bundle\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874068 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-service-ca\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874092 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82af3a60-6260-410c-a3c2-16acf3f30bb5-serving-cert\") pod \"etcd-operator-b45778765-kbrpw\" (UID: \"82af3a60-6260-410c-a3c2-16acf3f30bb5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874107 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/575ba7ec-e024-40c7-be59-44a90232b4f2-images\") pod \"machine-api-operator-5694c8668f-lt4q5\" (UID: \"575ba7ec-e024-40c7-be59-44a90232b4f2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4q5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874123 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c9dm\" (UniqueName: \"kubernetes.io/projected/30bc62a3-63c3-4cab-bbdf-b790a500d378-kube-api-access-8c9dm\") pod \"dns-operator-744455d44c-62v78\" (UID: \"30bc62a3-63c3-4cab-bbdf-b790a500d378\") " pod="openshift-dns-operator/dns-operator-744455d44c-62v78" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874147 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4zm5\" (UniqueName: \"kubernetes.io/projected/82af3a60-6260-410c-a3c2-16acf3f30bb5-kube-api-access-r4zm5\") pod \"etcd-operator-b45778765-kbrpw\" (UID: \"82af3a60-6260-410c-a3c2-16acf3f30bb5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874166 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7ch9\" (UniqueName: \"kubernetes.io/projected/949f77a9-d70d-45ad-8c60-554afb860a62-kube-api-access-w7ch9\") pod \"machine-approver-56656f9798-mfqxc\" (UID: \"949f77a9-d70d-45ad-8c60-554afb860a62\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc" Dec 11 05:15:44 crc kubenswrapper[4628]: 
I1211 05:15:44.874183 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/949f77a9-d70d-45ad-8c60-554afb860a62-auth-proxy-config\") pod \"machine-approver-56656f9798-mfqxc\" (UID: \"949f77a9-d70d-45ad-8c60-554afb860a62\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874201 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2361138-5571-4a9b-8ac9-a0cac66d682a-audit-policies\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874215 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7210231-68df-4f2b-888f-90827f723bd2-serving-cert\") pod \"route-controller-manager-6576b87f9c-ns4xc\" (UID: \"e7210231-68df-4f2b-888f-90827f723bd2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874233 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/575ba7ec-e024-40c7-be59-44a90232b4f2-config\") pod \"machine-api-operator-5694c8668f-lt4q5\" (UID: \"575ba7ec-e024-40c7-be59-44a90232b4f2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4q5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874250 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5111b417-34a8-405f-a0b8-eab04e144ff8-console-serving-cert\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874268 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874285 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3da56a92-c008-4f39-825b-baedf4f3195d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-54kds\" (UID: \"3da56a92-c008-4f39-825b-baedf4f3195d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874302 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5111b417-34a8-405f-a0b8-eab04e144ff8-console-oauth-config\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874319 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/82af3a60-6260-410c-a3c2-16acf3f30bb5-etcd-service-ca\") pod \"etcd-operator-b45778765-kbrpw\" (UID: \"82af3a60-6260-410c-a3c2-16acf3f30bb5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874337 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a2361138-5571-4a9b-8ac9-a0cac66d682a-etcd-client\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874354 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ttnd\" (UniqueName: \"kubernetes.io/projected/53447f0d-9279-4e5f-a63c-a1b050d24b4b-kube-api-access-7ttnd\") pod \"openshift-config-operator-7777fb866f-wwzr5\" (UID: \"53447f0d-9279-4e5f-a63c-a1b050d24b4b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874370 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-serving-cert\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874388 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a2361138-5571-4a9b-8ac9-a0cac66d682a-encryption-config\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874405 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874439 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874456 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-image-import-ca\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874475 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmm65\" (UniqueName: \"kubernetes.io/projected/12e38faa-255c-42bb-b9c1-faea88d6c989-kube-api-access-fmm65\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-vvw9w\" (UID: \"12e38faa-255c-42bb-b9c1-faea88d6c989\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vvw9w" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874492 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874518 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a2361138-5571-4a9b-8ac9-a0cac66d682a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874537 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-etcd-client\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874553 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874570 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d5a26a57-89a7-4c5c-902c-a19020e4a01a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8sttg\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874586 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb8sq\" (UniqueName: \"kubernetes.io/projected/a88efb22-2511-4037-9257-102b56de5226-kube-api-access-jb8sq\") pod \"authentication-operator-69f744f599-czhht\" (UID: \"a88efb22-2511-4037-9257-102b56de5226\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874601 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3da56a92-c008-4f39-825b-baedf4f3195d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-54kds\" (UID: \"3da56a92-c008-4f39-825b-baedf4f3195d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874616 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-console-config\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874634 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5a26a57-89a7-4c5c-902c-a19020e4a01a-client-ca\") pod \"controller-manager-879f6c89f-8sttg\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874651 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8ppv\" (UniqueName: \"kubernetes.io/projected/12244fa9-e2af-46bc-a35c-acc85884e68b-kube-api-access-q8ppv\") pod \"console-operator-58897d9998-t7rx5\" (UID: \"12244fa9-e2af-46bc-a35c-acc85884e68b\") " pod="openshift-console-operator/console-operator-58897d9998-t7rx5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874670 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3da56a92-c008-4f39-825b-baedf4f3195d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-54kds\" (UID: \"3da56a92-c008-4f39-825b-baedf4f3195d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874688 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/949f77a9-d70d-45ad-8c60-554afb860a62-machine-approver-tls\") pod \"machine-approver-56656f9798-mfqxc\" (UID: \"949f77a9-d70d-45ad-8c60-554afb860a62\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874707 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a88efb22-2511-4037-9257-102b56de5226-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-czhht\" (UID: \"a88efb22-2511-4037-9257-102b56de5226\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.874726 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-encryption-config\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.875079 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-xsldw"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.875932 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7210231-68df-4f2b-888f-90827f723bd2-client-ca\") pod \"route-controller-manager-6576b87f9c-ns4xc\" (UID: \"e7210231-68df-4f2b-888f-90827f723bd2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.877389 4628 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-server-qjp9m"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.878756 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-whf6x"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.880974 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12244fa9-e2af-46bc-a35c-acc85884e68b-serving-cert\") pod \"console-operator-58897d9998-t7rx5\" (UID: \"12244fa9-e2af-46bc-a35c-acc85884e68b\") " pod="openshift-console-operator/console-operator-58897d9998-t7rx5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.881150 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-xsldw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.881471 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qjp9m" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.881644 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.884976 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2361138-5571-4a9b-8ac9-a0cac66d682a-audit-dir\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.885788 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-audit\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.885900 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-audit-dir\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.886440 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82af3a60-6260-410c-a3c2-16acf3f30bb5-config\") pod \"etcd-operator-b45778765-kbrpw\" (UID: \"82af3a60-6260-410c-a3c2-16acf3f30bb5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.887480 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7210231-68df-4f2b-888f-90827f723bd2-config\") pod \"route-controller-manager-6576b87f9c-ns4xc\" (UID: \"e7210231-68df-4f2b-888f-90827f723bd2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.887516 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a26a57-89a7-4c5c-902c-a19020e4a01a-config\") pod \"controller-manager-879f6c89f-8sttg\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" 
Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.887974 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-node-pullsecrets\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.888599 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2361138-5571-4a9b-8ac9-a0cac66d682a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.889330 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a88efb22-2511-4037-9257-102b56de5226-config\") pod \"authentication-operator-69f744f599-czhht\" (UID: \"a88efb22-2511-4037-9257-102b56de5226\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.889465 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12244fa9-e2af-46bc-a35c-acc85884e68b-trusted-ca\") pod \"console-operator-58897d9998-t7rx5\" (UID: \"12244fa9-e2af-46bc-a35c-acc85884e68b\") " pod="openshift-console-operator/console-operator-58897d9998-t7rx5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.889663 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-trusted-ca-bundle\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.889800 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a26a57-89a7-4c5c-902c-a19020e4a01a-serving-cert\") pod \"controller-manager-879f6c89f-8sttg\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.890669 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a88efb22-2511-4037-9257-102b56de5226-serving-cert\") pod \"authentication-operator-69f744f599-czhht\" (UID: \"a88efb22-2511-4037-9257-102b56de5226\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.891198 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-image-import-ca\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.892207 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kjhs"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.892243 4628 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f954d"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.893267 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.893698 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.894142 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12244fa9-e2af-46bc-a35c-acc85884e68b-config\") pod \"console-operator-58897d9998-t7rx5\" (UID: \"12244fa9-e2af-46bc-a35c-acc85884e68b\") " pod="openshift-console-operator/console-operator-58897d9998-t7rx5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.894291 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f954d" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.898717 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qmzs9"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.899110 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a88efb22-2511-4037-9257-102b56de5226-service-ca-bundle\") pod \"authentication-operator-69f744f599-czhht\" (UID: \"a88efb22-2511-4037-9257-102b56de5226\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.899160 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.900159 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xwptc"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.902056 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6vrgl"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.900364 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.899224 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qmzs9" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.900768 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-encryption-config\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.901026 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2361138-5571-4a9b-8ac9-a0cac66d682a-audit-policies\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.901205 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82af3a60-6260-410c-a3c2-16acf3f30bb5-etcd-client\") pod \"etcd-operator-b45778765-kbrpw\" (UID: \"82af3a60-6260-410c-a3c2-16acf3f30bb5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.901668 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/652211a1-b8d0-427d-b6e0-abf88c891f25-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-76fr7\" (UID: \"652211a1-b8d0-427d-b6e0-abf88c891f25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-76fr7" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.902754 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a2361138-5571-4a9b-8ac9-a0cac66d682a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.900333 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/575ba7ec-e024-40c7-be59-44a90232b4f2-images\") pod \"machine-api-operator-5694c8668f-lt4q5\" (UID: \"575ba7ec-e024-40c7-be59-44a90232b4f2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4q5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.903241 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-serving-cert\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.903436 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5a26a57-89a7-4c5c-902c-a19020e4a01a-client-ca\") pod \"controller-manager-879f6c89f-8sttg\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.903541 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82af3a60-6260-410c-a3c2-16acf3f30bb5-serving-cert\") pod \"etcd-operator-b45778765-kbrpw\" (UID: 
\"82af3a60-6260-410c-a3c2-16acf3f30bb5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.904160 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7210231-68df-4f2b-888f-90827f723bd2-serving-cert\") pod \"route-controller-manager-6576b87f9c-ns4xc\" (UID: \"e7210231-68df-4f2b-888f-90827f723bd2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.904190 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-etcd-serving-ca\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.904451 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/82af3a60-6260-410c-a3c2-16acf3f30bb5-etcd-service-ca\") pod \"etcd-operator-b45778765-kbrpw\" (UID: \"82af3a60-6260-410c-a3c2-16acf3f30bb5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.904655 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a88efb22-2511-4037-9257-102b56de5226-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-czhht\" (UID: \"a88efb22-2511-4037-9257-102b56de5226\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.904983 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2361138-5571-4a9b-8ac9-a0cac66d682a-serving-cert\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.905131 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/82af3a60-6260-410c-a3c2-16acf3f30bb5-etcd-ca\") pod \"etcd-operator-b45778765-kbrpw\" (UID: \"82af3a60-6260-410c-a3c2-16acf3f30bb5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.905368 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-config\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.905932 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/575ba7ec-e024-40c7-be59-44a90232b4f2-config\") pod \"machine-api-operator-5694c8668f-lt4q5\" (UID: \"575ba7ec-e024-40c7-be59-44a90232b4f2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4q5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.906786 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d5a26a57-89a7-4c5c-902c-a19020e4a01a-proxy-ca-bundles\") pod 
\"controller-manager-879f6c89f-8sttg\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.906958 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.907902 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-czh2t"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.908391 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-etcd-client\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.909646 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jcfs4"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.910257 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6qvrg"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.910344 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jcfs4" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.911719 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a2361138-5571-4a9b-8ac9-a0cac66d682a-etcd-client\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.911956 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xv2pd"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.912302 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a2361138-5571-4a9b-8ac9-a0cac66d682a-encryption-config\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.912360 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/575ba7ec-e024-40c7-be59-44a90232b4f2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lt4q5\" (UID: \"575ba7ec-e024-40c7-be59-44a90232b4f2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4q5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.913246 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xv2pd" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.913367 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cjb5x"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.914470 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.914574 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.915423 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.916392 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7x6xf"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.917378 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zmqdv"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.918550 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2dgsz"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.918972 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.919117 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-78pgf"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.920162 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f954d"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.921250 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qmzs9"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.922701 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.923723 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.924921 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xv2pd"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.926305 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.927476 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xsldw"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.928598 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.939103 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 
05:15:44.947058 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jcfs4"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.950736 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cjb5x"] Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.957724 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975342 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3da56a92-c008-4f39-825b-baedf4f3195d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-54kds\" (UID: \"3da56a92-c008-4f39-825b-baedf4f3195d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975384 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/949f77a9-d70d-45ad-8c60-554afb860a62-machine-approver-tls\") pod \"machine-approver-56656f9798-mfqxc\" (UID: \"949f77a9-d70d-45ad-8c60-554afb860a62\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975413 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/30bc62a3-63c3-4cab-bbdf-b790a500d378-metrics-tls\") pod \"dns-operator-744455d44c-62v78\" (UID: \"30bc62a3-63c3-4cab-bbdf-b790a500d378\") " pod="openshift-dns-operator/dns-operator-744455d44c-62v78" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975442 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-audit-dir\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975469 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975495 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975521 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975577 
4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975613 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-trusted-ca-bundle\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975642 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrk5r\" (UniqueName: \"kubernetes.io/projected/3da56a92-c008-4f39-825b-baedf4f3195d-kube-api-access-hrk5r\") pod \"cluster-image-registry-operator-dc59b4c8b-54kds\" (UID: \"3da56a92-c008-4f39-825b-baedf4f3195d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975674 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975711 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-oauth-serving-cert\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975731 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/53447f0d-9279-4e5f-a63c-a1b050d24b4b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wwzr5\" (UID: \"53447f0d-9279-4e5f-a63c-a1b050d24b4b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975749 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kf9p\" (UniqueName: \"kubernetes.io/projected/5111b417-34a8-405f-a0b8-eab04e144ff8-kube-api-access-7kf9p\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975765 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975785 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/12e38faa-255c-42bb-b9c1-faea88d6c989-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vvw9w\" (UID: \"12e38faa-255c-42bb-b9c1-faea88d6c989\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vvw9w" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975803 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e38faa-255c-42bb-b9c1-faea88d6c989-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vvw9w\" (UID: \"12e38faa-255c-42bb-b9c1-faea88d6c989\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vvw9w" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975871 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-audit-policies\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975891 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mncrp\" (UniqueName: \"kubernetes.io/projected/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-kube-api-access-mncrp\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975910 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949f77a9-d70d-45ad-8c60-554afb860a62-config\") pod \"machine-approver-56656f9798-mfqxc\" (UID: \"949f77a9-d70d-45ad-8c60-554afb860a62\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975929 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53447f0d-9279-4e5f-a63c-a1b050d24b4b-serving-cert\") pod \"openshift-config-operator-7777fb866f-wwzr5\" (UID: \"53447f0d-9279-4e5f-a63c-a1b050d24b4b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975965 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c9dm\" (UniqueName: \"kubernetes.io/projected/30bc62a3-63c3-4cab-bbdf-b790a500d378-kube-api-access-8c9dm\") pod \"dns-operator-744455d44c-62v78\" (UID: \"30bc62a3-63c3-4cab-bbdf-b790a500d378\") " pod="openshift-dns-operator/dns-operator-744455d44c-62v78" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.975981 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-service-ca\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.976009 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7ch9\" (UniqueName: \"kubernetes.io/projected/949f77a9-d70d-45ad-8c60-554afb860a62-kube-api-access-w7ch9\") pod \"machine-approver-56656f9798-mfqxc\" (UID: 
\"949f77a9-d70d-45ad-8c60-554afb860a62\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.976031 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/949f77a9-d70d-45ad-8c60-554afb860a62-auth-proxy-config\") pod \"machine-approver-56656f9798-mfqxc\" (UID: \"949f77a9-d70d-45ad-8c60-554afb860a62\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.976050 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5111b417-34a8-405f-a0b8-eab04e144ff8-console-serving-cert\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.976073 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.976093 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ttnd\" (UniqueName: \"kubernetes.io/projected/53447f0d-9279-4e5f-a63c-a1b050d24b4b-kube-api-access-7ttnd\") pod \"openshift-config-operator-7777fb866f-wwzr5\" (UID: \"53447f0d-9279-4e5f-a63c-a1b050d24b4b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.976109 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3da56a92-c008-4f39-825b-baedf4f3195d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-54kds\" (UID: \"3da56a92-c008-4f39-825b-baedf4f3195d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.976125 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5111b417-34a8-405f-a0b8-eab04e144ff8-console-oauth-config\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.976143 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.976163 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" 
Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.976189 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmm65\" (UniqueName: \"kubernetes.io/projected/12e38faa-255c-42bb-b9c1-faea88d6c989-kube-api-access-fmm65\") pod \"openshift-controller-manager-operator-756b6f6bc6-vvw9w\" (UID: \"12e38faa-255c-42bb-b9c1-faea88d6c989\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vvw9w" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.976204 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.976223 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.976245 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3da56a92-c008-4f39-825b-baedf4f3195d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-54kds\" (UID: \"3da56a92-c008-4f39-825b-baedf4f3195d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.976266 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-console-config\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.977510 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-console-config\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.979070 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-audit-policies\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.979251 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.979597 4628 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.979620 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-trusted-ca-bundle\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.979981 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-audit-dir\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.981503 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e38faa-255c-42bb-b9c1-faea88d6c989-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vvw9w\" (UID: \"12e38faa-255c-42bb-b9c1-faea88d6c989\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vvw9w" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.981639 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.981646 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3da56a92-c008-4f39-825b-baedf4f3195d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-54kds\" (UID: \"3da56a92-c008-4f39-825b-baedf4f3195d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.982360 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/949f77a9-d70d-45ad-8c60-554afb860a62-auth-proxy-config\") pod \"machine-approver-56656f9798-mfqxc\" (UID: \"949f77a9-d70d-45ad-8c60-554afb860a62\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.982555 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-service-ca\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.982818 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/949f77a9-d70d-45ad-8c60-554afb860a62-machine-approver-tls\") pod \"machine-approver-56656f9798-mfqxc\" (UID: \"949f77a9-d70d-45ad-8c60-554afb860a62\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.983051 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.983541 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.983826 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949f77a9-d70d-45ad-8c60-554afb860a62-config\") pod \"machine-approver-56656f9798-mfqxc\" (UID: \"949f77a9-d70d-45ad-8c60-554afb860a62\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.983930 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/53447f0d-9279-4e5f-a63c-a1b050d24b4b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wwzr5\" (UID: \"53447f0d-9279-4e5f-a63c-a1b050d24b4b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.984035 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-oauth-serving-cert\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.984233 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/30bc62a3-63c3-4cab-bbdf-b790a500d378-metrics-tls\") pod \"dns-operator-744455d44c-62v78\" (UID: \"30bc62a3-63c3-4cab-bbdf-b790a500d378\") " pod="openshift-dns-operator/dns-operator-744455d44c-62v78" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.984562 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.984834 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.984965 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3da56a92-c008-4f39-825b-baedf4f3195d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-54kds\" (UID: 
\"3da56a92-c008-4f39-825b-baedf4f3195d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.985126 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.985355 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.985602 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.985942 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53447f0d-9279-4e5f-a63c-a1b050d24b4b-serving-cert\") pod \"openshift-config-operator-7777fb866f-wwzr5\" (UID: \"53447f0d-9279-4e5f-a63c-a1b050d24b4b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.986806 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12e38faa-255c-42bb-b9c1-faea88d6c989-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vvw9w\" (UID: \"12e38faa-255c-42bb-b9c1-faea88d6c989\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vvw9w" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.986972 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5111b417-34a8-405f-a0b8-eab04e144ff8-console-serving-cert\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.987961 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.989693 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5111b417-34a8-405f-a0b8-eab04e144ff8-console-oauth-config\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:44 
crc kubenswrapper[4628]: I1211 05:15:44.990138 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:44 crc kubenswrapper[4628]: I1211 05:15:44.998270 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.020908 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.040909 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.062401 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.082539 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.099401 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.117964 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.139685 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.159318 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.179880 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.198928 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.218696 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.238979 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.258262 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.278489 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.298628 4628 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.318512 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.338159 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.358506 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.377533 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.397518 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.418758 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.438232 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.458829 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.478639 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.499089 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.534540 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.537228 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.558231 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.578435 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.598677 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.617667 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.638565 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.658904 4628 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.679057 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.698511 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.718905 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.738120 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.761837 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.778781 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.798309 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.819092 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.844498 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.856641 4628 request.go:700] Waited for 1.006152082s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/secrets?fieldSelector=metadata.name%3Drouter-certs-default&limit=500&resourceVersion=0 Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.859509 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.878018 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.889341 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.889705 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.890012 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.890125 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.899411 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.918118 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.939124 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.957974 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 11 05:15:45 crc kubenswrapper[4628]: I1211 05:15:45.982105 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.007545 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.018619 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.059295 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.079042 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.098109 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.118691 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.139629 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.159348 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.205139 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlnll\" (UniqueName: \"kubernetes.io/projected/a2361138-5571-4a9b-8ac9-a0cac66d682a-kube-api-access-xlnll\") pod \"apiserver-7bbb656c7d-42zdb\" (UID: \"a2361138-5571-4a9b-8ac9-a0cac66d682a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.220105 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.225890 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnr5f\" (UniqueName: \"kubernetes.io/projected/4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76-kube-api-access-wnr5f\") pod \"apiserver-76f77b778f-n4h96\" (UID: \"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76\") " pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 
05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.238542 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.258736 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.271947 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.310208 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdzh8\" (UniqueName: \"kubernetes.io/projected/e7210231-68df-4f2b-888f-90827f723bd2-kube-api-access-pdzh8\") pod \"route-controller-manager-6576b87f9c-ns4xc\" (UID: \"e7210231-68df-4f2b-888f-90827f723bd2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.321109 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.325603 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnfkc\" (UniqueName: \"kubernetes.io/projected/d5a26a57-89a7-4c5c-902c-a19020e4a01a-kube-api-access-fnfkc\") pod \"controller-manager-879f6c89f-8sttg\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.337915 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.357331 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.378091 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.380822 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.399034 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.431070 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.438162 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4zm5\" (UniqueName: \"kubernetes.io/projected/82af3a60-6260-410c-a3c2-16acf3f30bb5-kube-api-access-r4zm5\") pod \"etcd-operator-b45778765-kbrpw\" (UID: \"82af3a60-6260-410c-a3c2-16acf3f30bb5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.459411 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8ppv\" (UniqueName: \"kubernetes.io/projected/12244fa9-e2af-46bc-a35c-acc85884e68b-kube-api-access-q8ppv\") pod \"console-operator-58897d9998-t7rx5\" (UID: \"12244fa9-e2af-46bc-a35c-acc85884e68b\") " pod="openshift-console-operator/console-operator-58897d9998-t7rx5" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.494057 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.495389 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5wr2\" (UniqueName: \"kubernetes.io/projected/575ba7ec-e024-40c7-be59-44a90232b4f2-kube-api-access-t5wr2\") pod \"machine-api-operator-5694c8668f-lt4q5\" (UID: \"575ba7ec-e024-40c7-be59-44a90232b4f2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4q5" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.505233 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66dt2\" (UniqueName: \"kubernetes.io/projected/652211a1-b8d0-427d-b6e0-abf88c891f25-kube-api-access-66dt2\") pod \"cluster-samples-operator-665b6dd947-76fr7\" (UID: \"652211a1-b8d0-427d-b6e0-abf88c891f25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-76fr7" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.515024 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb8sq\" (UniqueName: \"kubernetes.io/projected/a88efb22-2511-4037-9257-102b56de5226-kube-api-access-jb8sq\") pod \"authentication-operator-69f744f599-czhht\" (UID: \"a88efb22-2511-4037-9257-102b56de5226\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.517702 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.537826 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.559622 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-t7rx5" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.561509 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.579052 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.583136 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4q5" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.600165 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.617862 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.618281 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.619974 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.638512 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.657911 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.698477 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.721034 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.737666 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.744706 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-76fr7" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.758213 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.778103 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.797129 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.817662 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.837682 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.865512 4628 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.876371 4628 request.go:700] Waited for 1.961509501s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.878101 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.921335 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8sttg"] Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.939351 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3da56a92-c008-4f39-825b-baedf4f3195d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-54kds\" (UID: \"3da56a92-c008-4f39-825b-baedf4f3195d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.940062 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-n4h96"] Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.940111 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb"] Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.940130 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-t7rx5"] Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.940146 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc"] Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.943969 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ttnd\" (UniqueName: \"kubernetes.io/projected/53447f0d-9279-4e5f-a63c-a1b050d24b4b-kube-api-access-7ttnd\") pod \"openshift-config-operator-7777fb866f-wwzr5\" (UID: \"53447f0d-9279-4e5f-a63c-a1b050d24b4b\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.961161 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrk5r\" (UniqueName: \"kubernetes.io/projected/3da56a92-c008-4f39-825b-baedf4f3195d-kube-api-access-hrk5r\") pod \"cluster-image-registry-operator-dc59b4c8b-54kds\" (UID: \"3da56a92-c008-4f39-825b-baedf4f3195d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.966052 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" Dec 11 05:15:46 crc kubenswrapper[4628]: I1211 05:15:46.978613 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mncrp\" (UniqueName: \"kubernetes.io/projected/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-kube-api-access-mncrp\") pod \"oauth-openshift-558db77b4-wg98b\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.015413 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68d7972a-8fde-4878-a758-99ed42b3e4c5-metrics-certs\") pod \"network-metrics-daemon-rfts6\" (UID: \"68d7972a-8fde-4878-a758-99ed42b3e4c5\") " pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.016245 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kf9p\" (UniqueName: \"kubernetes.io/projected/5111b417-34a8-405f-a0b8-eab04e144ff8-kube-api-access-7kf9p\") pod \"console-f9d7485db-4nw5h\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.017022 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c9dm\" (UniqueName: \"kubernetes.io/projected/30bc62a3-63c3-4cab-bbdf-b790a500d378-kube-api-access-8c9dm\") pod \"dns-operator-744455d44c-62v78\" (UID: \"30bc62a3-63c3-4cab-bbdf-b790a500d378\") " pod="openshift-dns-operator/dns-operator-744455d44c-62v78" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.041441 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmm65\" (UniqueName: \"kubernetes.io/projected/12e38faa-255c-42bb-b9c1-faea88d6c989-kube-api-access-fmm65\") pod \"openshift-controller-manager-operator-756b6f6bc6-vvw9w\" (UID: \"12e38faa-255c-42bb-b9c1-faea88d6c989\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vvw9w" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.057129 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.057569 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.060354 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7ch9\" (UniqueName: \"kubernetes.io/projected/949f77a9-d70d-45ad-8c60-554afb860a62-kube-api-access-w7ch9\") pod \"machine-approver-56656f9798-mfqxc\" (UID: \"949f77a9-d70d-45ad-8c60-554afb860a62\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.066336 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.077909 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.099180 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.116741 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-62v78" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.117900 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.134272 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68d7972a-8fde-4878-a758-99ed42b3e4c5-metrics-certs\") pod \"network-metrics-daemon-rfts6\" (UID: \"68d7972a-8fde-4878-a758-99ed42b3e4c5\") " pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.140056 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.150798 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfts6" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.158104 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-76fr7"] Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.164565 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.221325 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fc85c6c-0509-4b32-b9eb-fe58f5b9306d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lbf8w\" (UID: \"3fc85c6c-0509-4b32-b9eb-fe58f5b9306d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbf8w" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.221361 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1dac6ca-2acb-4ec2-bd04-c307aa26c17f-service-ca-bundle\") pod \"router-default-5444994796-8z6mf\" (UID: \"b1dac6ca-2acb-4ec2-bd04-c307aa26c17f\") " pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.221391 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e53868f-b961-440d-a046-4ef042fddbbf-proxy-tls\") pod \"machine-config-controller-84d6567774-2dgsz\" (UID: \"1e53868f-b961-440d-a046-4ef042fddbbf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dgsz" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.221433 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85e69a46-d878-4004-9a92-c1ccc00000e9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7kjhs\" (UID: \"85e69a46-d878-4004-9a92-c1ccc00000e9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kjhs" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.221459 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa963e29-dda2-4d61-827f-2da2d53bfe52-bound-sa-token\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.221473 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5-metrics-tls\") pod \"ingress-operator-5b745b69d9-h9ckw\" (UID: \"03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.221516 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa963e29-dda2-4d61-827f-2da2d53bfe52-registry-certificates\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc 
kubenswrapper[4628]: I1211 05:15:47.221534 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa963e29-dda2-4d61-827f-2da2d53bfe52-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.221881 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fe8af448-2223-4442-9c1d-2ea4948b0c12-profile-collector-cert\") pod \"catalog-operator-68c6474976-m8br6\" (UID: \"fe8af448-2223-4442-9c1d-2ea4948b0c12\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.221982 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-h9ckw\" (UID: \"03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.222036 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b1dac6ca-2acb-4ec2-bd04-c307aa26c17f-stats-auth\") pod \"router-default-5444994796-8z6mf\" (UID: \"b1dac6ca-2acb-4ec2-bd04-c307aa26c17f\") " pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.222115 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85e69a46-d878-4004-9a92-c1ccc00000e9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7kjhs\" (UID: \"85e69a46-d878-4004-9a92-c1ccc00000e9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kjhs" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.222155 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa963e29-dda2-4d61-827f-2da2d53bfe52-registry-tls\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.222209 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/159f3336-509f-41f1-ad07-380009d48dd7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-czh2t\" (UID: \"159f3336-509f-41f1-ad07-380009d48dd7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-czh2t" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.222268 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/342f0865-c965-4465-b45c-42b0f84af9e1-config-volume\") pod \"dns-default-7kdnp\" (UID: \"342f0865-c965-4465-b45c-42b0f84af9e1\") " pod="openshift-dns/dns-default-7kdnp" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.222308 4628 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/342f0865-c965-4465-b45c-42b0f84af9e1-metrics-tls\") pod \"dns-default-7kdnp\" (UID: \"342f0865-c965-4465-b45c-42b0f84af9e1\") " pod="openshift-dns/dns-default-7kdnp" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.222400 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fe8af448-2223-4442-9c1d-2ea4948b0c12-srv-cert\") pod \"catalog-operator-68c6474976-m8br6\" (UID: \"fe8af448-2223-4442-9c1d-2ea4948b0c12\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.223129 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pszvr\" (UniqueName: \"kubernetes.io/projected/342f0865-c965-4465-b45c-42b0f84af9e1-kube-api-access-pszvr\") pod \"dns-default-7kdnp\" (UID: \"342f0865-c965-4465-b45c-42b0f84af9e1\") " pod="openshift-dns/dns-default-7kdnp" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.223197 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nngx6\" (UniqueName: \"kubernetes.io/projected/e22056a0-8001-488d-9dd7-9368d4a459e8-kube-api-access-nngx6\") pod \"control-plane-machine-set-operator-78cbb6b69f-78pgf\" (UID: \"e22056a0-8001-488d-9dd7-9368d4a459e8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-78pgf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.224185 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxvp8\" (UniqueName: \"kubernetes.io/projected/1e53868f-b961-440d-a046-4ef042fddbbf-kube-api-access-gxvp8\") pod \"machine-config-controller-84d6567774-2dgsz\" (UID: \"1e53868f-b961-440d-a046-4ef042fddbbf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dgsz" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.224250 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: E1211 05:15:47.228515 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:47.728485924 +0000 UTC m=+50.145832622 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.230043 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgbd6\" (UniqueName: \"kubernetes.io/projected/03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5-kube-api-access-zgbd6\") pod \"ingress-operator-5b745b69d9-h9ckw\" (UID: \"03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.230457 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c8cc\" (UniqueName: \"kubernetes.io/projected/cff00c77-bf53-43ce-a5ac-62d5a9264c9b-kube-api-access-6c8cc\") pod \"migrator-59844c95c7-zmqdv\" (UID: \"cff00c77-bf53-43ce-a5ac-62d5a9264c9b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zmqdv" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.230866 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5-trusted-ca\") pod \"ingress-operator-5b745b69d9-h9ckw\" (UID: \"03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.231050 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b1dac6ca-2acb-4ec2-bd04-c307aa26c17f-default-certificate\") pod \"router-default-5444994796-8z6mf\" (UID: \"b1dac6ca-2acb-4ec2-bd04-c307aa26c17f\") " pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.231656 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt77n\" (UniqueName: \"kubernetes.io/projected/159f3336-509f-41f1-ad07-380009d48dd7-kube-api-access-lt77n\") pod \"kube-storage-version-migrator-operator-b67b599dd-czh2t\" (UID: \"159f3336-509f-41f1-ad07-380009d48dd7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-czh2t" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.238511 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkcsh\" (UniqueName: \"kubernetes.io/projected/fa963e29-dda2-4d61-827f-2da2d53bfe52-kube-api-access-nkcsh\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.239760 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa963e29-dda2-4d61-827f-2da2d53bfe52-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.240611 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fc85c6c-0509-4b32-b9eb-fe58f5b9306d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lbf8w\" (UID: \"3fc85c6c-0509-4b32-b9eb-fe58f5b9306d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbf8w" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.240827 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce2358a9-7f38-41e7-ba92-b82e8e98b458-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6vrgl\" (UID: \"ce2358a9-7f38-41e7-ba92-b82e8e98b458\") " pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.241086 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/231b7565-2d1b-4c1e-be8e-8f1d1dd3f558-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xwptc\" (UID: \"231b7565-2d1b-4c1e-be8e-8f1d1dd3f558\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xwptc" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.241212 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ef42e00-ffb9-43db-b82e-ee36516674ce-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k4bsx\" (UID: \"8ef42e00-ffb9-43db-b82e-ee36516674ce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k4bsx" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.241306 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ef42e00-ffb9-43db-b82e-ee36516674ce-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k4bsx\" (UID: \"8ef42e00-ffb9-43db-b82e-ee36516674ce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k4bsx" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.241383 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/231b7565-2d1b-4c1e-be8e-8f1d1dd3f558-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xwptc\" (UID: \"231b7565-2d1b-4c1e-be8e-8f1d1dd3f558\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xwptc" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.241420 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e53868f-b961-440d-a046-4ef042fddbbf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2dgsz\" (UID: \"1e53868f-b961-440d-a046-4ef042fddbbf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dgsz" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.241455 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/85e69a46-d878-4004-9a92-c1ccc00000e9-config\") pod \"kube-apiserver-operator-766d6c64bb-7kjhs\" (UID: \"85e69a46-d878-4004-9a92-c1ccc00000e9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kjhs" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.241486 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e22056a0-8001-488d-9dd7-9368d4a459e8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-78pgf\" (UID: \"e22056a0-8001-488d-9dd7-9368d4a459e8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-78pgf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.241549 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nwg6\" (UniqueName: \"kubernetes.io/projected/ce2358a9-7f38-41e7-ba92-b82e8e98b458-kube-api-access-6nwg6\") pod \"marketplace-operator-79b997595-6vrgl\" (UID: \"ce2358a9-7f38-41e7-ba92-b82e8e98b458\") " pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.241580 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4wg4\" (UniqueName: \"kubernetes.io/projected/b1dac6ca-2acb-4ec2-bd04-c307aa26c17f-kube-api-access-b4wg4\") pod \"router-default-5444994796-8z6mf\" (UID: \"b1dac6ca-2acb-4ec2-bd04-c307aa26c17f\") " pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.241616 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa963e29-dda2-4d61-827f-2da2d53bfe52-trusted-ca\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.241727 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz2x4\" (UniqueName: \"kubernetes.io/projected/fe8af448-2223-4442-9c1d-2ea4948b0c12-kube-api-access-xz2x4\") pod \"catalog-operator-68c6474976-m8br6\" (UID: \"fe8af448-2223-4442-9c1d-2ea4948b0c12\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.241771 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/159f3336-509f-41f1-ad07-380009d48dd7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-czh2t\" (UID: \"159f3336-509f-41f1-ad07-380009d48dd7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-czh2t" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.241817 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/231b7565-2d1b-4c1e-be8e-8f1d1dd3f558-config\") pod \"kube-controller-manager-operator-78b949d7b-xwptc\" (UID: \"231b7565-2d1b-4c1e-be8e-8f1d1dd3f558\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xwptc" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.241891 4628 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b9qd\" (UniqueName: \"kubernetes.io/projected/3fc85c6c-0509-4b32-b9eb-fe58f5b9306d-kube-api-access-7b9qd\") pod \"openshift-apiserver-operator-796bbdcf4f-lbf8w\" (UID: \"3fc85c6c-0509-4b32-b9eb-fe58f5b9306d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbf8w" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.241919 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1dac6ca-2acb-4ec2-bd04-c307aa26c17f-metrics-certs\") pod \"router-default-5444994796-8z6mf\" (UID: \"b1dac6ca-2acb-4ec2-bd04-c307aa26c17f\") " pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.241971 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ef42e00-ffb9-43db-b82e-ee36516674ce-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k4bsx\" (UID: \"8ef42e00-ffb9-43db-b82e-ee36516674ce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k4bsx" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.242032 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce2358a9-7f38-41e7-ba92-b82e8e98b458-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6vrgl\" (UID: \"ce2358a9-7f38-41e7-ba92-b82e8e98b458\") " pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.284513 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.284645 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kbrpw"] Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.309868 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-t7rx5" event={"ID":"12244fa9-e2af-46bc-a35c-acc85884e68b","Type":"ContainerStarted","Data":"eb022a706dab705272fd196c7bc618243e5009b3ab16b7b1da78abb4bde87b57"} Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.336787 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vvw9w" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.341966 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-n4h96" event={"ID":"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76","Type":"ContainerStarted","Data":"149068a4abf94225ff73fc09f82d18db1e0dc835c9c6efe60cb91996b65a9c34"} Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.342725 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:47 crc kubenswrapper[4628]: E1211 05:15:47.342990 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:47.842911326 +0000 UTC m=+50.260258024 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.343130 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fe8af448-2223-4442-9c1d-2ea4948b0c12-profile-collector-cert\") pod \"catalog-operator-68c6474976-m8br6\" (UID: \"fe8af448-2223-4442-9c1d-2ea4948b0c12\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.343190 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/209cebdd-7761-42a6-9bf1-089cc06c3dca-config-volume\") pod \"collect-profiles-29423835-4gtch\" (UID: \"209cebdd-7761-42a6-9bf1-089cc06c3dca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.343250 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-h9ckw\" (UID: \"03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.343298 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcplr\" (UniqueName: \"kubernetes.io/projected/b07b0692-4062-4a52-8689-be350bf137cb-kube-api-access-tcplr\") pod \"machine-config-server-qjp9m\" (UID: \"b07b0692-4062-4a52-8689-be350bf137cb\") " pod="openshift-machine-config-operator/machine-config-server-qjp9m" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.343346 4628 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4d32e4c2-7e36-4cf7-8369-982164414c7b-webhook-cert\") pod \"packageserver-d55dfcdfc-whb54\" (UID: \"4d32e4c2-7e36-4cf7-8369-982164414c7b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.343392 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b1dac6ca-2acb-4ec2-bd04-c307aa26c17f-stats-auth\") pod \"router-default-5444994796-8z6mf\" (UID: \"b1dac6ca-2acb-4ec2-bd04-c307aa26c17f\") " pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.343422 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4d32e4c2-7e36-4cf7-8369-982164414c7b-tmpfs\") pod \"packageserver-d55dfcdfc-whb54\" (UID: \"4d32e4c2-7e36-4cf7-8369-982164414c7b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.343460 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85e69a46-d878-4004-9a92-c1ccc00000e9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7kjhs\" (UID: \"85e69a46-d878-4004-9a92-c1ccc00000e9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kjhs" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.343498 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgkkc\" (UniqueName: \"kubernetes.io/projected/1eefadbf-ac92-4b97-999e-fb262b5d45c2-kube-api-access-kgkkc\") pod \"cni-sysctl-allowlist-ds-whf6x\" (UID: \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\") " pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.343538 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa963e29-dda2-4d61-827f-2da2d53bfe52-registry-tls\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.343571 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/159f3336-509f-41f1-ad07-380009d48dd7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-czh2t\" (UID: \"159f3336-509f-41f1-ad07-380009d48dd7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-czh2t" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.343610 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/342f0865-c965-4465-b45c-42b0f84af9e1-config-volume\") pod \"dns-default-7kdnp\" (UID: \"342f0865-c965-4465-b45c-42b0f84af9e1\") " pod="openshift-dns/dns-default-7kdnp" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.343650 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/342f0865-c965-4465-b45c-42b0f84af9e1-metrics-tls\") pod \"dns-default-7kdnp\" (UID: 
\"342f0865-c965-4465-b45c-42b0f84af9e1\") " pod="openshift-dns/dns-default-7kdnp" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.343694 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fe8af448-2223-4442-9c1d-2ea4948b0c12-srv-cert\") pod \"catalog-operator-68c6474976-m8br6\" (UID: \"fe8af448-2223-4442-9c1d-2ea4948b0c12\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.343732 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a9a689c-7b60-4ab3-b842-62e7b3dca41c-serving-cert\") pod \"service-ca-operator-777779d784-xv2pd\" (UID: \"8a9a689c-7b60-4ab3-b842-62e7b3dca41c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xv2pd" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.343773 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b07b0692-4062-4a52-8689-be350bf137cb-node-bootstrap-token\") pod \"machine-config-server-qjp9m\" (UID: \"b07b0692-4062-4a52-8689-be350bf137cb\") " pod="openshift-machine-config-operator/machine-config-server-qjp9m" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.343818 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pszvr\" (UniqueName: \"kubernetes.io/projected/342f0865-c965-4465-b45c-42b0f84af9e1-kube-api-access-pszvr\") pod \"dns-default-7kdnp\" (UID: \"342f0865-c965-4465-b45c-42b0f84af9e1\") " pod="openshift-dns/dns-default-7kdnp" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.344094 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nngx6\" (UniqueName: \"kubernetes.io/projected/e22056a0-8001-488d-9dd7-9368d4a459e8-kube-api-access-nngx6\") pod \"control-plane-machine-set-operator-78cbb6b69f-78pgf\" (UID: \"e22056a0-8001-488d-9dd7-9368d4a459e8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-78pgf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.344769 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/159f3336-509f-41f1-ad07-380009d48dd7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-czh2t\" (UID: \"159f3336-509f-41f1-ad07-380009d48dd7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-czh2t" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.345456 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5zdv\" (UniqueName: \"kubernetes.io/projected/a11e465b-c6bf-465d-877f-0f09de58c651-kube-api-access-n5zdv\") pod \"olm-operator-6b444d44fb-lbkqk\" (UID: \"a11e465b-c6bf-465d-877f-0f09de58c651\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.345500 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/12356db6-09ae-438c-a085-6b26ea3b97e8-mountpoint-dir\") pod \"csi-hostpathplugin-cjb5x\" (UID: \"12356db6-09ae-438c-a085-6b26ea3b97e8\") " pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:47 crc 
kubenswrapper[4628]: I1211 05:15:47.345518 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/12356db6-09ae-438c-a085-6b26ea3b97e8-registration-dir\") pod \"csi-hostpathplugin-cjb5x\" (UID: \"12356db6-09ae-438c-a085-6b26ea3b97e8\") " pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.345574 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxvp8\" (UniqueName: \"kubernetes.io/projected/1e53868f-b961-440d-a046-4ef042fddbbf-kube-api-access-gxvp8\") pod \"machine-config-controller-84d6567774-2dgsz\" (UID: \"1e53868f-b961-440d-a046-4ef042fddbbf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dgsz" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.347073 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a11e465b-c6bf-465d-877f-0f09de58c651-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lbkqk\" (UID: \"a11e465b-c6bf-465d-877f-0f09de58c651\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.347142 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k8np\" (UniqueName: \"kubernetes.io/projected/a73e280a-008c-4e72-8844-375de50d4222-kube-api-access-4k8np\") pod \"downloads-7954f5f757-xsldw\" (UID: \"a73e280a-008c-4e72-8844-375de50d4222\") " pod="openshift-console/downloads-7954f5f757-xsldw" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.347424 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8brqr\" (UniqueName: \"kubernetes.io/projected/7bde36a3-0865-4d8d-a741-4efc59e3b409-kube-api-access-8brqr\") pod \"multus-admission-controller-857f4d67dd-7x6xf\" (UID: \"7bde36a3-0865-4d8d-a741-4efc59e3b409\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7x6xf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.347499 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.347550 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b07b0692-4062-4a52-8689-be350bf137cb-certs\") pod \"machine-config-server-qjp9m\" (UID: \"b07b0692-4062-4a52-8689-be350bf137cb\") " pod="openshift-machine-config-operator/machine-config-server-qjp9m" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.347587 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c8cc\" (UniqueName: \"kubernetes.io/projected/cff00c77-bf53-43ce-a5ac-62d5a9264c9b-kube-api-access-6c8cc\") pod \"migrator-59844c95c7-zmqdv\" (UID: \"cff00c77-bf53-43ce-a5ac-62d5a9264c9b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zmqdv" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.347634 4628 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5-trusted-ca\") pod \"ingress-operator-5b745b69d9-h9ckw\" (UID: \"03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.347662 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgbd6\" (UniqueName: \"kubernetes.io/projected/03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5-kube-api-access-zgbd6\") pod \"ingress-operator-5b745b69d9-h9ckw\" (UID: \"03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.347714 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b1dac6ca-2acb-4ec2-bd04-c307aa26c17f-default-certificate\") pod \"router-default-5444994796-8z6mf\" (UID: \"b1dac6ca-2acb-4ec2-bd04-c307aa26c17f\") " pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.347749 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b513f403-38c4-40af-a4b5-13df12b7f807-images\") pod \"machine-config-operator-74547568cd-mhkq4\" (UID: \"b513f403-38c4-40af-a4b5-13df12b7f807\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.347793 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1eefadbf-ac92-4b97-999e-fb262b5d45c2-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-whf6x\" (UID: \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\") " pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.347818 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfxng\" (UniqueName: \"kubernetes.io/projected/12356db6-09ae-438c-a085-6b26ea3b97e8-kube-api-access-zfxng\") pod \"csi-hostpathplugin-cjb5x\" (UID: \"12356db6-09ae-438c-a085-6b26ea3b97e8\") " pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.347880 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt77n\" (UniqueName: \"kubernetes.io/projected/159f3336-509f-41f1-ad07-380009d48dd7-kube-api-access-lt77n\") pod \"kube-storage-version-migrator-operator-b67b599dd-czh2t\" (UID: \"159f3336-509f-41f1-ad07-380009d48dd7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-czh2t" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.347906 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6-signing-key\") pod \"service-ca-9c57cc56f-qmzs9\" (UID: \"ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6\") " pod="openshift-service-ca/service-ca-9c57cc56f-qmzs9" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.348296 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/342f0865-c965-4465-b45c-42b0f84af9e1-config-volume\") pod \"dns-default-7kdnp\" (UID: \"342f0865-c965-4465-b45c-42b0f84af9e1\") " pod="openshift-dns/dns-default-7kdnp" Dec 11 05:15:47 crc kubenswrapper[4628]: E1211 05:15:47.348548 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:47.848534896 +0000 UTC m=+50.265881594 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.350430 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5-trusted-ca\") pod \"ingress-operator-5b745b69d9-h9ckw\" (UID: \"03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.348454 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbd6r\" (UniqueName: \"kubernetes.io/projected/ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6-kube-api-access-tbd6r\") pod \"service-ca-9c57cc56f-qmzs9\" (UID: \"ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6\") " pod="openshift-service-ca/service-ca-9c57cc56f-qmzs9" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.350483 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d35c515-c721-4484-bef7-80e52206f26f-cert\") pod \"ingress-canary-jcfs4\" (UID: \"8d35c515-c721-4484-bef7-80e52206f26f\") " pod="openshift-ingress-canary/ingress-canary-jcfs4" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.350536 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkcsh\" (UniqueName: \"kubernetes.io/projected/fa963e29-dda2-4d61-827f-2da2d53bfe52-kube-api-access-nkcsh\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.350562 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/12356db6-09ae-438c-a085-6b26ea3b97e8-csi-data-dir\") pod \"csi-hostpathplugin-cjb5x\" (UID: \"12356db6-09ae-438c-a085-6b26ea3b97e8\") " pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.350611 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa963e29-dda2-4d61-827f-2da2d53bfe52-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.350653 4628 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fc85c6c-0509-4b32-b9eb-fe58f5b9306d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lbf8w\" (UID: \"3fc85c6c-0509-4b32-b9eb-fe58f5b9306d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbf8w" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.350729 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/12356db6-09ae-438c-a085-6b26ea3b97e8-socket-dir\") pod \"csi-hostpathplugin-cjb5x\" (UID: \"12356db6-09ae-438c-a085-6b26ea3b97e8\") " pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.350777 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxv6g\" (UniqueName: \"kubernetes.io/projected/209cebdd-7761-42a6-9bf1-089cc06c3dca-kube-api-access-fxv6g\") pod \"collect-profiles-29423835-4gtch\" (UID: \"209cebdd-7761-42a6-9bf1-089cc06c3dca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.350814 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce2358a9-7f38-41e7-ba92-b82e8e98b458-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6vrgl\" (UID: \"ce2358a9-7f38-41e7-ba92-b82e8e98b458\") " pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.350875 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/231b7565-2d1b-4c1e-be8e-8f1d1dd3f558-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xwptc\" (UID: \"231b7565-2d1b-4c1e-be8e-8f1d1dd3f558\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xwptc" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.350922 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1eefadbf-ac92-4b97-999e-fb262b5d45c2-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-whf6x\" (UID: \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\") " pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.350986 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ef42e00-ffb9-43db-b82e-ee36516674ce-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k4bsx\" (UID: \"8ef42e00-ffb9-43db-b82e-ee36516674ce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k4bsx" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.351034 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a11e465b-c6bf-465d-877f-0f09de58c651-srv-cert\") pod \"olm-operator-6b444d44fb-lbkqk\" (UID: \"a11e465b-c6bf-465d-877f-0f09de58c651\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.351084 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/231b7565-2d1b-4c1e-be8e-8f1d1dd3f558-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xwptc\" (UID: \"231b7565-2d1b-4c1e-be8e-8f1d1dd3f558\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xwptc" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.351142 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ef42e00-ffb9-43db-b82e-ee36516674ce-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k4bsx\" (UID: \"8ef42e00-ffb9-43db-b82e-ee36516674ce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k4bsx" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.351754 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e53868f-b961-440d-a046-4ef042fddbbf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2dgsz\" (UID: \"1e53868f-b961-440d-a046-4ef042fddbbf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dgsz" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.351790 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgqsl\" (UniqueName: \"kubernetes.io/projected/b513f403-38c4-40af-a4b5-13df12b7f807-kube-api-access-mgqsl\") pod \"machine-config-operator-74547568cd-mhkq4\" (UID: \"b513f403-38c4-40af-a4b5-13df12b7f807\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.351855 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85e69a46-d878-4004-9a92-c1ccc00000e9-config\") pod \"kube-apiserver-operator-766d6c64bb-7kjhs\" (UID: \"85e69a46-d878-4004-9a92-c1ccc00000e9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kjhs" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.351889 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nwg6\" (UniqueName: \"kubernetes.io/projected/ce2358a9-7f38-41e7-ba92-b82e8e98b458-kube-api-access-6nwg6\") pod \"marketplace-operator-79b997595-6vrgl\" (UID: \"ce2358a9-7f38-41e7-ba92-b82e8e98b458\") " pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.351947 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e22056a0-8001-488d-9dd7-9368d4a459e8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-78pgf\" (UID: \"e22056a0-8001-488d-9dd7-9368d4a459e8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-78pgf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.351997 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4wg4\" (UniqueName: \"kubernetes.io/projected/b1dac6ca-2acb-4ec2-bd04-c307aa26c17f-kube-api-access-b4wg4\") pod \"router-default-5444994796-8z6mf\" (UID: \"b1dac6ca-2acb-4ec2-bd04-c307aa26c17f\") " pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.352030 4628 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6-signing-cabundle\") pod \"service-ca-9c57cc56f-qmzs9\" (UID: \"ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6\") " pod="openshift-service-ca/service-ca-9c57cc56f-qmzs9" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.352072 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" event={"ID":"d5a26a57-89a7-4c5c-902c-a19020e4a01a","Type":"ContainerStarted","Data":"c9d6d32a5bf1c8bcb6592efb173becfddc354f0563c037dbc40b1eca4c0c1ffa"} Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.352093 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa963e29-dda2-4d61-827f-2da2d53bfe52-trusted-ca\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.352122 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz2x4\" (UniqueName: \"kubernetes.io/projected/fe8af448-2223-4442-9c1d-2ea4948b0c12-kube-api-access-xz2x4\") pod \"catalog-operator-68c6474976-m8br6\" (UID: \"fe8af448-2223-4442-9c1d-2ea4948b0c12\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.352221 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcw8s\" (UniqueName: \"kubernetes.io/projected/8d35c515-c721-4484-bef7-80e52206f26f-kube-api-access-fcw8s\") pod \"ingress-canary-jcfs4\" (UID: \"8d35c515-c721-4484-bef7-80e52206f26f\") " pod="openshift-ingress-canary/ingress-canary-jcfs4" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.352274 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/231b7565-2d1b-4c1e-be8e-8f1d1dd3f558-config\") pod \"kube-controller-manager-operator-78b949d7b-xwptc\" (UID: \"231b7565-2d1b-4c1e-be8e-8f1d1dd3f558\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xwptc" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.352305 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/159f3336-509f-41f1-ad07-380009d48dd7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-czh2t\" (UID: \"159f3336-509f-41f1-ad07-380009d48dd7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-czh2t" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.352358 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b513f403-38c4-40af-a4b5-13df12b7f807-auth-proxy-config\") pod \"machine-config-operator-74547568cd-mhkq4\" (UID: \"b513f403-38c4-40af-a4b5-13df12b7f807\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.352393 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b9qd\" (UniqueName: 
\"kubernetes.io/projected/3fc85c6c-0509-4b32-b9eb-fe58f5b9306d-kube-api-access-7b9qd\") pod \"openshift-apiserver-operator-796bbdcf4f-lbf8w\" (UID: \"3fc85c6c-0509-4b32-b9eb-fe58f5b9306d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbf8w" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.353234 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1eefadbf-ac92-4b97-999e-fb262b5d45c2-ready\") pod \"cni-sysctl-allowlist-ds-whf6x\" (UID: \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\") " pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.353263 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ef42e00-ffb9-43db-b82e-ee36516674ce-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k4bsx\" (UID: \"8ef42e00-ffb9-43db-b82e-ee36516674ce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k4bsx" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.353287 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1dac6ca-2acb-4ec2-bd04-c307aa26c17f-metrics-certs\") pod \"router-default-5444994796-8z6mf\" (UID: \"b1dac6ca-2acb-4ec2-bd04-c307aa26c17f\") " pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.353310 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce2358a9-7f38-41e7-ba92-b82e8e98b458-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6vrgl\" (UID: \"ce2358a9-7f38-41e7-ba92-b82e8e98b458\") " pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.353357 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/209cebdd-7761-42a6-9bf1-089cc06c3dca-secret-volume\") pod \"collect-profiles-29423835-4gtch\" (UID: \"209cebdd-7761-42a6-9bf1-089cc06c3dca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.353385 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a9a689c-7b60-4ab3-b842-62e7b3dca41c-config\") pod \"service-ca-operator-777779d784-xv2pd\" (UID: \"8a9a689c-7b60-4ab3-b842-62e7b3dca41c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xv2pd" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.353574 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce2358a9-7f38-41e7-ba92-b82e8e98b458-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6vrgl\" (UID: \"ce2358a9-7f38-41e7-ba92-b82e8e98b458\") " pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.353932 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7mm9\" (UniqueName: \"kubernetes.io/projected/8a9a689c-7b60-4ab3-b842-62e7b3dca41c-kube-api-access-v7mm9\") pod 
\"service-ca-operator-777779d784-xv2pd\" (UID: \"8a9a689c-7b60-4ab3-b842-62e7b3dca41c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xv2pd" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.354818 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fc85c6c-0509-4b32-b9eb-fe58f5b9306d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lbf8w\" (UID: \"3fc85c6c-0509-4b32-b9eb-fe58f5b9306d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbf8w" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.354881 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7bde36a3-0865-4d8d-a741-4efc59e3b409-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7x6xf\" (UID: \"7bde36a3-0865-4d8d-a741-4efc59e3b409\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7x6xf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.354911 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e53868f-b961-440d-a046-4ef042fddbbf-proxy-tls\") pod \"machine-config-controller-84d6567774-2dgsz\" (UID: \"1e53868f-b961-440d-a046-4ef042fddbbf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dgsz" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.354930 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1dac6ca-2acb-4ec2-bd04-c307aa26c17f-service-ca-bundle\") pod \"router-default-5444994796-8z6mf\" (UID: \"b1dac6ca-2acb-4ec2-bd04-c307aa26c17f\") " pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.354956 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kstk\" (UniqueName: \"kubernetes.io/projected/4d32e4c2-7e36-4cf7-8369-982164414c7b-kube-api-access-8kstk\") pod \"packageserver-d55dfcdfc-whb54\" (UID: \"4d32e4c2-7e36-4cf7-8369-982164414c7b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.354985 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85e69a46-d878-4004-9a92-c1ccc00000e9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7kjhs\" (UID: \"85e69a46-d878-4004-9a92-c1ccc00000e9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kjhs" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.355098 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b513f403-38c4-40af-a4b5-13df12b7f807-proxy-tls\") pod \"machine-config-operator-74547568cd-mhkq4\" (UID: \"b513f403-38c4-40af-a4b5-13df12b7f807\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.355105 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa963e29-dda2-4d61-827f-2da2d53bfe52-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.355133 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/12356db6-09ae-438c-a085-6b26ea3b97e8-plugins-dir\") pod \"csi-hostpathplugin-cjb5x\" (UID: \"12356db6-09ae-438c-a085-6b26ea3b97e8\") " pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.355159 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa963e29-dda2-4d61-827f-2da2d53bfe52-bound-sa-token\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.355193 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5-metrics-tls\") pod \"ingress-operator-5b745b69d9-h9ckw\" (UID: \"03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.355216 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckqms\" (UniqueName: \"kubernetes.io/projected/1d7951da-3429-4ac6-b797-208606682d8e-kube-api-access-ckqms\") pod \"package-server-manager-789f6589d5-f954d\" (UID: \"1d7951da-3429-4ac6-b797-208606682d8e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f954d" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.355242 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d7951da-3429-4ac6-b797-208606682d8e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-f954d\" (UID: \"1d7951da-3429-4ac6-b797-208606682d8e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f954d" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.355265 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4d32e4c2-7e36-4cf7-8369-982164414c7b-apiservice-cert\") pod \"packageserver-d55dfcdfc-whb54\" (UID: \"4d32e4c2-7e36-4cf7-8369-982164414c7b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.355293 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa963e29-dda2-4d61-827f-2da2d53bfe52-registry-certificates\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.355359 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa963e29-dda2-4d61-827f-2da2d53bfe52-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 
05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.356007 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa963e29-dda2-4d61-827f-2da2d53bfe52-registry-tls\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.357172 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/342f0865-c965-4465-b45c-42b0f84af9e1-metrics-tls\") pod \"dns-default-7kdnp\" (UID: \"342f0865-c965-4465-b45c-42b0f84af9e1\") " pod="openshift-dns/dns-default-7kdnp" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.358384 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b1dac6ca-2acb-4ec2-bd04-c307aa26c17f-default-certificate\") pod \"router-default-5444994796-8z6mf\" (UID: \"b1dac6ca-2acb-4ec2-bd04-c307aa26c17f\") " pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.358751 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.359759 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85e69a46-d878-4004-9a92-c1ccc00000e9-config\") pod \"kube-apiserver-operator-766d6c64bb-7kjhs\" (UID: \"85e69a46-d878-4004-9a92-c1ccc00000e9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kjhs" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.360322 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1dac6ca-2acb-4ec2-bd04-c307aa26c17f-service-ca-bundle\") pod \"router-default-5444994796-8z6mf\" (UID: \"b1dac6ca-2acb-4ec2-bd04-c307aa26c17f\") " pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.360782 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e53868f-b961-440d-a046-4ef042fddbbf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-2dgsz\" (UID: \"1e53868f-b961-440d-a046-4ef042fddbbf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dgsz" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.361347 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fc85c6c-0509-4b32-b9eb-fe58f5b9306d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lbf8w\" (UID: \"3fc85c6c-0509-4b32-b9eb-fe58f5b9306d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbf8w" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.361391 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/231b7565-2d1b-4c1e-be8e-8f1d1dd3f558-config\") pod \"kube-controller-manager-operator-78b949d7b-xwptc\" (UID: \"231b7565-2d1b-4c1e-be8e-8f1d1dd3f558\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xwptc" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.363701 4628 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ef42e00-ffb9-43db-b82e-ee36516674ce-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k4bsx\" (UID: \"8ef42e00-ffb9-43db-b82e-ee36516674ce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k4bsx" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.364606 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa963e29-dda2-4d61-827f-2da2d53bfe52-trusted-ca\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.366391 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa963e29-dda2-4d61-827f-2da2d53bfe52-registry-certificates\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.370113 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fc85c6c-0509-4b32-b9eb-fe58f5b9306d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lbf8w\" (UID: \"3fc85c6c-0509-4b32-b9eb-fe58f5b9306d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbf8w" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.370335 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" event={"ID":"e7210231-68df-4f2b-888f-90827f723bd2","Type":"ContainerStarted","Data":"d08f39bc9250bf57aab70d4639a0fe533b75a52f3aebed1c3c4b6980c8efd834"} Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.370375 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" event={"ID":"e7210231-68df-4f2b-888f-90827f723bd2","Type":"ContainerStarted","Data":"76760e00231de5b4f5c15f721f8952d34cef584ffd2033118fe03c6600327a25"} Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.370657 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fe8af448-2223-4442-9c1d-2ea4948b0c12-profile-collector-cert\") pod \"catalog-operator-68c6474976-m8br6\" (UID: \"fe8af448-2223-4442-9c1d-2ea4948b0c12\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.372357 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.372454 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce2358a9-7f38-41e7-ba92-b82e8e98b458-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6vrgl\" (UID: \"ce2358a9-7f38-41e7-ba92-b82e8e98b458\") " pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.372714 4628 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ns4xc 
container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.372747 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" podUID="e7210231-68df-4f2b-888f-90827f723bd2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.375942 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa963e29-dda2-4d61-827f-2da2d53bfe52-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.377373 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" event={"ID":"a2361138-5571-4a9b-8ac9-a0cac66d682a","Type":"ContainerStarted","Data":"839e71f0a980b8d2da1e58c3c3f9c17b0343396045b7169b8beda5b912a16db0"} Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.378130 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fe8af448-2223-4442-9c1d-2ea4948b0c12-srv-cert\") pod \"catalog-operator-68c6474976-m8br6\" (UID: \"fe8af448-2223-4442-9c1d-2ea4948b0c12\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.378522 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/231b7565-2d1b-4c1e-be8e-8f1d1dd3f558-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-xwptc\" (UID: \"231b7565-2d1b-4c1e-be8e-8f1d1dd3f558\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xwptc" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.379491 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85e69a46-d878-4004-9a92-c1ccc00000e9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7kjhs\" (UID: \"85e69a46-d878-4004-9a92-c1ccc00000e9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kjhs" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.380658 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b1dac6ca-2acb-4ec2-bd04-c307aa26c17f-stats-auth\") pod \"router-default-5444994796-8z6mf\" (UID: \"b1dac6ca-2acb-4ec2-bd04-c307aa26c17f\") " pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.381672 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ef42e00-ffb9-43db-b82e-ee36516674ce-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k4bsx\" (UID: \"8ef42e00-ffb9-43db-b82e-ee36516674ce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k4bsx" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 
05:15:47.386211 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1dac6ca-2acb-4ec2-bd04-c307aa26c17f-metrics-certs\") pod \"router-default-5444994796-8z6mf\" (UID: \"b1dac6ca-2acb-4ec2-bd04-c307aa26c17f\") " pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.386340 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5-metrics-tls\") pod \"ingress-operator-5b745b69d9-h9ckw\" (UID: \"03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.386657 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e53868f-b961-440d-a046-4ef042fddbbf-proxy-tls\") pod \"machine-config-controller-84d6567774-2dgsz\" (UID: \"1e53868f-b961-440d-a046-4ef042fddbbf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dgsz" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.387752 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e22056a0-8001-488d-9dd7-9368d4a459e8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-78pgf\" (UID: \"e22056a0-8001-488d-9dd7-9368d4a459e8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-78pgf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.392340 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/159f3336-509f-41f1-ad07-380009d48dd7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-czh2t\" (UID: \"159f3336-509f-41f1-ad07-380009d48dd7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-czh2t" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.405761 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85e69a46-d878-4004-9a92-c1ccc00000e9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7kjhs\" (UID: \"85e69a46-d878-4004-9a92-c1ccc00000e9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kjhs" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.412271 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kjhs" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.421973 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5-bound-sa-token\") pod \"ingress-operator-5b745b69d9-h9ckw\" (UID: \"03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.440405 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pszvr\" (UniqueName: \"kubernetes.io/projected/342f0865-c965-4465-b45c-42b0f84af9e1-kube-api-access-pszvr\") pod \"dns-default-7kdnp\" (UID: \"342f0865-c965-4465-b45c-42b0f84af9e1\") " pod="openshift-dns/dns-default-7kdnp" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.455881 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456129 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcplr\" (UniqueName: \"kubernetes.io/projected/b07b0692-4062-4a52-8689-be350bf137cb-kube-api-access-tcplr\") pod \"machine-config-server-qjp9m\" (UID: \"b07b0692-4062-4a52-8689-be350bf137cb\") " pod="openshift-machine-config-operator/machine-config-server-qjp9m" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456163 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4d32e4c2-7e36-4cf7-8369-982164414c7b-webhook-cert\") pod \"packageserver-d55dfcdfc-whb54\" (UID: \"4d32e4c2-7e36-4cf7-8369-982164414c7b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456181 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4d32e4c2-7e36-4cf7-8369-982164414c7b-tmpfs\") pod \"packageserver-d55dfcdfc-whb54\" (UID: \"4d32e4c2-7e36-4cf7-8369-982164414c7b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456197 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgkkc\" (UniqueName: \"kubernetes.io/projected/1eefadbf-ac92-4b97-999e-fb262b5d45c2-kube-api-access-kgkkc\") pod \"cni-sysctl-allowlist-ds-whf6x\" (UID: \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\") " pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456220 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a9a689c-7b60-4ab3-b842-62e7b3dca41c-serving-cert\") pod \"service-ca-operator-777779d784-xv2pd\" (UID: \"8a9a689c-7b60-4ab3-b842-62e7b3dca41c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xv2pd" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456237 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/b07b0692-4062-4a52-8689-be350bf137cb-node-bootstrap-token\") pod \"machine-config-server-qjp9m\" (UID: \"b07b0692-4062-4a52-8689-be350bf137cb\") " pod="openshift-machine-config-operator/machine-config-server-qjp9m" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456261 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5zdv\" (UniqueName: \"kubernetes.io/projected/a11e465b-c6bf-465d-877f-0f09de58c651-kube-api-access-n5zdv\") pod \"olm-operator-6b444d44fb-lbkqk\" (UID: \"a11e465b-c6bf-465d-877f-0f09de58c651\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456276 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/12356db6-09ae-438c-a085-6b26ea3b97e8-mountpoint-dir\") pod \"csi-hostpathplugin-cjb5x\" (UID: \"12356db6-09ae-438c-a085-6b26ea3b97e8\") " pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456292 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/12356db6-09ae-438c-a085-6b26ea3b97e8-registration-dir\") pod \"csi-hostpathplugin-cjb5x\" (UID: \"12356db6-09ae-438c-a085-6b26ea3b97e8\") " pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456321 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a11e465b-c6bf-465d-877f-0f09de58c651-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lbkqk\" (UID: \"a11e465b-c6bf-465d-877f-0f09de58c651\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456339 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k8np\" (UniqueName: \"kubernetes.io/projected/a73e280a-008c-4e72-8844-375de50d4222-kube-api-access-4k8np\") pod \"downloads-7954f5f757-xsldw\" (UID: \"a73e280a-008c-4e72-8844-375de50d4222\") " pod="openshift-console/downloads-7954f5f757-xsldw" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456358 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8brqr\" (UniqueName: \"kubernetes.io/projected/7bde36a3-0865-4d8d-a741-4efc59e3b409-kube-api-access-8brqr\") pod \"multus-admission-controller-857f4d67dd-7x6xf\" (UID: \"7bde36a3-0865-4d8d-a741-4efc59e3b409\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7x6xf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456381 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b07b0692-4062-4a52-8689-be350bf137cb-certs\") pod \"machine-config-server-qjp9m\" (UID: \"b07b0692-4062-4a52-8689-be350bf137cb\") " pod="openshift-machine-config-operator/machine-config-server-qjp9m" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456423 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b513f403-38c4-40af-a4b5-13df12b7f807-images\") pod \"machine-config-operator-74547568cd-mhkq4\" (UID: \"b513f403-38c4-40af-a4b5-13df12b7f807\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4" Dec 
11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456438 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1eefadbf-ac92-4b97-999e-fb262b5d45c2-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-whf6x\" (UID: \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\") " pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456453 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfxng\" (UniqueName: \"kubernetes.io/projected/12356db6-09ae-438c-a085-6b26ea3b97e8-kube-api-access-zfxng\") pod \"csi-hostpathplugin-cjb5x\" (UID: \"12356db6-09ae-438c-a085-6b26ea3b97e8\") " pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456483 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6-signing-key\") pod \"service-ca-9c57cc56f-qmzs9\" (UID: \"ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6\") " pod="openshift-service-ca/service-ca-9c57cc56f-qmzs9" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456498 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbd6r\" (UniqueName: \"kubernetes.io/projected/ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6-kube-api-access-tbd6r\") pod \"service-ca-9c57cc56f-qmzs9\" (UID: \"ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6\") " pod="openshift-service-ca/service-ca-9c57cc56f-qmzs9" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456513 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d35c515-c721-4484-bef7-80e52206f26f-cert\") pod \"ingress-canary-jcfs4\" (UID: \"8d35c515-c721-4484-bef7-80e52206f26f\") " pod="openshift-ingress-canary/ingress-canary-jcfs4" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456535 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/12356db6-09ae-438c-a085-6b26ea3b97e8-csi-data-dir\") pod \"csi-hostpathplugin-cjb5x\" (UID: \"12356db6-09ae-438c-a085-6b26ea3b97e8\") " pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456553 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/12356db6-09ae-438c-a085-6b26ea3b97e8-socket-dir\") pod \"csi-hostpathplugin-cjb5x\" (UID: \"12356db6-09ae-438c-a085-6b26ea3b97e8\") " pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456569 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxv6g\" (UniqueName: \"kubernetes.io/projected/209cebdd-7761-42a6-9bf1-089cc06c3dca-kube-api-access-fxv6g\") pod \"collect-profiles-29423835-4gtch\" (UID: \"209cebdd-7761-42a6-9bf1-089cc06c3dca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456587 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a11e465b-c6bf-465d-877f-0f09de58c651-srv-cert\") pod \"olm-operator-6b444d44fb-lbkqk\" (UID: \"a11e465b-c6bf-465d-877f-0f09de58c651\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456602 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1eefadbf-ac92-4b97-999e-fb262b5d45c2-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-whf6x\" (UID: \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\") " pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456643 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgqsl\" (UniqueName: \"kubernetes.io/projected/b513f403-38c4-40af-a4b5-13df12b7f807-kube-api-access-mgqsl\") pod \"machine-config-operator-74547568cd-mhkq4\" (UID: \"b513f403-38c4-40af-a4b5-13df12b7f807\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456673 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6-signing-cabundle\") pod \"service-ca-9c57cc56f-qmzs9\" (UID: \"ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6\") " pod="openshift-service-ca/service-ca-9c57cc56f-qmzs9" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456701 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcw8s\" (UniqueName: \"kubernetes.io/projected/8d35c515-c721-4484-bef7-80e52206f26f-kube-api-access-fcw8s\") pod \"ingress-canary-jcfs4\" (UID: \"8d35c515-c721-4484-bef7-80e52206f26f\") " pod="openshift-ingress-canary/ingress-canary-jcfs4" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456737 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b513f403-38c4-40af-a4b5-13df12b7f807-auth-proxy-config\") pod \"machine-config-operator-74547568cd-mhkq4\" (UID: \"b513f403-38c4-40af-a4b5-13df12b7f807\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456756 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1eefadbf-ac92-4b97-999e-fb262b5d45c2-ready\") pod \"cni-sysctl-allowlist-ds-whf6x\" (UID: \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\") " pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456772 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/209cebdd-7761-42a6-9bf1-089cc06c3dca-secret-volume\") pod \"collect-profiles-29423835-4gtch\" (UID: \"209cebdd-7761-42a6-9bf1-089cc06c3dca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456791 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a9a689c-7b60-4ab3-b842-62e7b3dca41c-config\") pod \"service-ca-operator-777779d784-xv2pd\" (UID: \"8a9a689c-7b60-4ab3-b842-62e7b3dca41c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xv2pd" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456826 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7mm9\" 
(UniqueName: \"kubernetes.io/projected/8a9a689c-7b60-4ab3-b842-62e7b3dca41c-kube-api-access-v7mm9\") pod \"service-ca-operator-777779d784-xv2pd\" (UID: \"8a9a689c-7b60-4ab3-b842-62e7b3dca41c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xv2pd" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456859 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7bde36a3-0865-4d8d-a741-4efc59e3b409-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7x6xf\" (UID: \"7bde36a3-0865-4d8d-a741-4efc59e3b409\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7x6xf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456878 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kstk\" (UniqueName: \"kubernetes.io/projected/4d32e4c2-7e36-4cf7-8369-982164414c7b-kube-api-access-8kstk\") pod \"packageserver-d55dfcdfc-whb54\" (UID: \"4d32e4c2-7e36-4cf7-8369-982164414c7b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456897 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b513f403-38c4-40af-a4b5-13df12b7f807-proxy-tls\") pod \"machine-config-operator-74547568cd-mhkq4\" (UID: \"b513f403-38c4-40af-a4b5-13df12b7f807\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456914 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/12356db6-09ae-438c-a085-6b26ea3b97e8-plugins-dir\") pod \"csi-hostpathplugin-cjb5x\" (UID: \"12356db6-09ae-438c-a085-6b26ea3b97e8\") " pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456942 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckqms\" (UniqueName: \"kubernetes.io/projected/1d7951da-3429-4ac6-b797-208606682d8e-kube-api-access-ckqms\") pod \"package-server-manager-789f6589d5-f954d\" (UID: \"1d7951da-3429-4ac6-b797-208606682d8e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f954d" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456957 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d7951da-3429-4ac6-b797-208606682d8e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-f954d\" (UID: \"1d7951da-3429-4ac6-b797-208606682d8e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f954d" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456972 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4d32e4c2-7e36-4cf7-8369-982164414c7b-apiservice-cert\") pod \"packageserver-d55dfcdfc-whb54\" (UID: \"4d32e4c2-7e36-4cf7-8369-982164414c7b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.456989 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/209cebdd-7761-42a6-9bf1-089cc06c3dca-config-volume\") pod 
\"collect-profiles-29423835-4gtch\" (UID: \"209cebdd-7761-42a6-9bf1-089cc06c3dca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch" Dec 11 05:15:47 crc kubenswrapper[4628]: E1211 05:15:47.458923 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:47.958893759 +0000 UTC m=+50.376240457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.458914 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/209cebdd-7761-42a6-9bf1-089cc06c3dca-config-volume\") pod \"collect-profiles-29423835-4gtch\" (UID: \"209cebdd-7761-42a6-9bf1-089cc06c3dca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.459031 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/12356db6-09ae-438c-a085-6b26ea3b97e8-csi-data-dir\") pod \"csi-hostpathplugin-cjb5x\" (UID: \"12356db6-09ae-438c-a085-6b26ea3b97e8\") " pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.459252 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/12356db6-09ae-438c-a085-6b26ea3b97e8-socket-dir\") pod \"csi-hostpathplugin-cjb5x\" (UID: \"12356db6-09ae-438c-a085-6b26ea3b97e8\") " pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.459695 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1eefadbf-ac92-4b97-999e-fb262b5d45c2-ready\") pod \"cni-sysctl-allowlist-ds-whf6x\" (UID: \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\") " pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.461426 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4d32e4c2-7e36-4cf7-8369-982164414c7b-tmpfs\") pod \"packageserver-d55dfcdfc-whb54\" (UID: \"4d32e4c2-7e36-4cf7-8369-982164414c7b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.462300 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1eefadbf-ac92-4b97-999e-fb262b5d45c2-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-whf6x\" (UID: \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\") " pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.463236 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8a9a689c-7b60-4ab3-b842-62e7b3dca41c-config\") pod \"service-ca-operator-777779d784-xv2pd\" (UID: \"8a9a689c-7b60-4ab3-b842-62e7b3dca41c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xv2pd" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.463792 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b513f403-38c4-40af-a4b5-13df12b7f807-images\") pod \"machine-config-operator-74547568cd-mhkq4\" (UID: \"b513f403-38c4-40af-a4b5-13df12b7f807\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.469430 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/12356db6-09ae-438c-a085-6b26ea3b97e8-mountpoint-dir\") pod \"csi-hostpathplugin-cjb5x\" (UID: \"12356db6-09ae-438c-a085-6b26ea3b97e8\") " pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.471040 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-czhht"] Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.473175 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/12356db6-09ae-438c-a085-6b26ea3b97e8-plugins-dir\") pod \"csi-hostpathplugin-cjb5x\" (UID: \"12356db6-09ae-438c-a085-6b26ea3b97e8\") " pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.478025 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lt4q5"] Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.480942 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1eefadbf-ac92-4b97-999e-fb262b5d45c2-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-whf6x\" (UID: \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\") " pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.481541 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b513f403-38c4-40af-a4b5-13df12b7f807-auth-proxy-config\") pod \"machine-config-operator-74547568cd-mhkq4\" (UID: \"b513f403-38c4-40af-a4b5-13df12b7f807\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.482181 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/12356db6-09ae-438c-a085-6b26ea3b97e8-registration-dir\") pod \"csi-hostpathplugin-cjb5x\" (UID: \"12356db6-09ae-438c-a085-6b26ea3b97e8\") " pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.494313 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nngx6\" (UniqueName: \"kubernetes.io/projected/e22056a0-8001-488d-9dd7-9368d4a459e8-kube-api-access-nngx6\") pod \"control-plane-machine-set-operator-78cbb6b69f-78pgf\" (UID: \"e22056a0-8001-488d-9dd7-9368d4a459e8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-78pgf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.494715 4628 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6-signing-cabundle\") pod \"service-ca-9c57cc56f-qmzs9\" (UID: \"ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6\") " pod="openshift-service-ca/service-ca-9c57cc56f-qmzs9" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.510583 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a11e465b-c6bf-465d-877f-0f09de58c651-srv-cert\") pod \"olm-operator-6b444d44fb-lbkqk\" (UID: \"a11e465b-c6bf-465d-877f-0f09de58c651\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.511046 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/209cebdd-7761-42a6-9bf1-089cc06c3dca-secret-volume\") pod \"collect-profiles-29423835-4gtch\" (UID: \"209cebdd-7761-42a6-9bf1-089cc06c3dca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.511342 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b513f403-38c4-40af-a4b5-13df12b7f807-proxy-tls\") pod \"machine-config-operator-74547568cd-mhkq4\" (UID: \"b513f403-38c4-40af-a4b5-13df12b7f807\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.511654 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7bde36a3-0865-4d8d-a741-4efc59e3b409-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7x6xf\" (UID: \"7bde36a3-0865-4d8d-a741-4efc59e3b409\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7x6xf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.512051 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4d32e4c2-7e36-4cf7-8369-982164414c7b-webhook-cert\") pod \"packageserver-d55dfcdfc-whb54\" (UID: \"4d32e4c2-7e36-4cf7-8369-982164414c7b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.516875 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b07b0692-4062-4a52-8689-be350bf137cb-certs\") pod \"machine-config-server-qjp9m\" (UID: \"b07b0692-4062-4a52-8689-be350bf137cb\") " pod="openshift-machine-config-operator/machine-config-server-qjp9m" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.519098 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d35c515-c721-4484-bef7-80e52206f26f-cert\") pod \"ingress-canary-jcfs4\" (UID: \"8d35c515-c721-4484-bef7-80e52206f26f\") " pod="openshift-ingress-canary/ingress-canary-jcfs4" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.519108 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a9a689c-7b60-4ab3-b842-62e7b3dca41c-serving-cert\") pod \"service-ca-operator-777779d784-xv2pd\" (UID: \"8a9a689c-7b60-4ab3-b842-62e7b3dca41c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xv2pd" Dec 11 05:15:47 crc kubenswrapper[4628]: 
I1211 05:15:47.519583 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d7951da-3429-4ac6-b797-208606682d8e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-f954d\" (UID: \"1d7951da-3429-4ac6-b797-208606682d8e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f954d" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.519644 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b07b0692-4062-4a52-8689-be350bf137cb-node-bootstrap-token\") pod \"machine-config-server-qjp9m\" (UID: \"b07b0692-4062-4a52-8689-be350bf137cb\") " pod="openshift-machine-config-operator/machine-config-server-qjp9m" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.520258 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6-signing-key\") pod \"service-ca-9c57cc56f-qmzs9\" (UID: \"ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6\") " pod="openshift-service-ca/service-ca-9c57cc56f-qmzs9" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.521178 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a11e465b-c6bf-465d-877f-0f09de58c651-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lbkqk\" (UID: \"a11e465b-c6bf-465d-877f-0f09de58c651\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.522181 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4d32e4c2-7e36-4cf7-8369-982164414c7b-apiservice-cert\") pod \"packageserver-d55dfcdfc-whb54\" (UID: \"4d32e4c2-7e36-4cf7-8369-982164414c7b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.520925 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxvp8\" (UniqueName: \"kubernetes.io/projected/1e53868f-b961-440d-a046-4ef042fddbbf-kube-api-access-gxvp8\") pod \"machine-config-controller-84d6567774-2dgsz\" (UID: \"1e53868f-b961-440d-a046-4ef042fddbbf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dgsz" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.532124 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgbd6\" (UniqueName: \"kubernetes.io/projected/03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5-kube-api-access-zgbd6\") pod \"ingress-operator-5b745b69d9-h9ckw\" (UID: \"03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.551314 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c8cc\" (UniqueName: \"kubernetes.io/projected/cff00c77-bf53-43ce-a5ac-62d5a9264c9b-kube-api-access-6c8cc\") pod \"migrator-59844c95c7-zmqdv\" (UID: \"cff00c77-bf53-43ce-a5ac-62d5a9264c9b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zmqdv" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.554784 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wg98b"] Dec 11 05:15:47 crc 
kubenswrapper[4628]: I1211 05:15:47.558581 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: E1211 05:15:47.559017 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:48.059005999 +0000 UTC m=+50.476352697 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.575372 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt77n\" (UniqueName: \"kubernetes.io/projected/159f3336-509f-41f1-ad07-380009d48dd7-kube-api-access-lt77n\") pod \"kube-storage-version-migrator-operator-b67b599dd-czh2t\" (UID: \"159f3336-509f-41f1-ad07-380009d48dd7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-czh2t" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.587397 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nwg6\" (UniqueName: \"kubernetes.io/projected/ce2358a9-7f38-41e7-ba92-b82e8e98b458-kube-api-access-6nwg6\") pod \"marketplace-operator-79b997595-6vrgl\" (UID: \"ce2358a9-7f38-41e7-ba92-b82e8e98b458\") " pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.597765 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkcsh\" (UniqueName: \"kubernetes.io/projected/fa963e29-dda2-4d61-827f-2da2d53bfe52-kube-api-access-nkcsh\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.598879 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds"] Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.620357 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/231b7565-2d1b-4c1e-be8e-8f1d1dd3f558-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-xwptc\" (UID: \"231b7565-2d1b-4c1e-be8e-8f1d1dd3f558\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xwptc" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.628123 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b9qd\" (UniqueName: \"kubernetes.io/projected/3fc85c6c-0509-4b32-b9eb-fe58f5b9306d-kube-api-access-7b9qd\") pod \"openshift-apiserver-operator-796bbdcf4f-lbf8w\" (UID: 
\"3fc85c6c-0509-4b32-b9eb-fe58f5b9306d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbf8w" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.635678 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ef42e00-ffb9-43db-b82e-ee36516674ce-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-k4bsx\" (UID: \"8ef42e00-ffb9-43db-b82e-ee36516674ce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k4bsx" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.659331 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:47 crc kubenswrapper[4628]: E1211 05:15:47.659534 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:48.159498999 +0000 UTC m=+50.576845697 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.659895 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: E1211 05:15:47.660281 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:48.160250109 +0000 UTC m=+50.577596807 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.670373 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz2x4\" (UniqueName: \"kubernetes.io/projected/fe8af448-2223-4442-9c1d-2ea4948b0c12-kube-api-access-xz2x4\") pod \"catalog-operator-68c6474976-m8br6\" (UID: \"fe8af448-2223-4442-9c1d-2ea4948b0c12\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.670899 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbf8w" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.678587 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4wg4\" (UniqueName: \"kubernetes.io/projected/b1dac6ca-2acb-4ec2-bd04-c307aa26c17f-kube-api-access-b4wg4\") pod \"router-default-5444994796-8z6mf\" (UID: \"b1dac6ca-2acb-4ec2-bd04-c307aa26c17f\") " pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.685959 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7kdnp" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.696923 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k4bsx" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.703780 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xwptc" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.705458 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa963e29-dda2-4d61-827f-2da2d53bfe52-bound-sa-token\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.737106 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-czh2t" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.739173 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcplr\" (UniqueName: \"kubernetes.io/projected/b07b0692-4062-4a52-8689-be350bf137cb-kube-api-access-tcplr\") pod \"machine-config-server-qjp9m\" (UID: \"b07b0692-4062-4a52-8689-be350bf137cb\") " pod="openshift-machine-config-operator/machine-config-server-qjp9m" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.748698 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zmqdv" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.753418 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.763446 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:47 crc kubenswrapper[4628]: E1211 05:15:47.763817 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:48.26379875 +0000 UTC m=+50.681145448 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.767419 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-78pgf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.769177 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxv6g\" (UniqueName: \"kubernetes.io/projected/209cebdd-7761-42a6-9bf1-089cc06c3dca-kube-api-access-fxv6g\") pod \"collect-profiles-29423835-4gtch\" (UID: \"209cebdd-7761-42a6-9bf1-089cc06c3dca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.784089 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.784683 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k8np\" (UniqueName: \"kubernetes.io/projected/a73e280a-008c-4e72-8844-375de50d4222-kube-api-access-4k8np\") pod \"downloads-7954f5f757-xsldw\" (UID: \"a73e280a-008c-4e72-8844-375de50d4222\") " pod="openshift-console/downloads-7954f5f757-xsldw" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.790620 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.799744 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dgsz" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.805562 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.808816 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgkkc\" (UniqueName: \"kubernetes.io/projected/1eefadbf-ac92-4b97-999e-fb262b5d45c2-kube-api-access-kgkkc\") pod \"cni-sysctl-allowlist-ds-whf6x\" (UID: \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\") " pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.837752 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8brqr\" (UniqueName: \"kubernetes.io/projected/7bde36a3-0865-4d8d-a741-4efc59e3b409-kube-api-access-8brqr\") pod \"multus-admission-controller-857f4d67dd-7x6xf\" (UID: \"7bde36a3-0865-4d8d-a741-4efc59e3b409\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7x6xf" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.837852 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-xsldw" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.840631 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbd6r\" (UniqueName: \"kubernetes.io/projected/ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6-kube-api-access-tbd6r\") pod \"service-ca-9c57cc56f-qmzs9\" (UID: \"ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6\") " pod="openshift-service-ca/service-ca-9c57cc56f-qmzs9" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.847988 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qjp9m" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.850728 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rfts6"] Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.860669 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7mm9\" (UniqueName: \"kubernetes.io/projected/8a9a689c-7b60-4ab3-b842-62e7b3dca41c-kube-api-access-v7mm9\") pod \"service-ca-operator-777779d784-xv2pd\" (UID: \"8a9a689c-7b60-4ab3-b842-62e7b3dca41c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xv2pd" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.861107 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.864748 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.869600 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch" Dec 11 05:15:47 crc kubenswrapper[4628]: E1211 05:15:47.871065 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:48.371045171 +0000 UTC m=+50.788391869 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.891059 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qmzs9" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.914987 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kstk\" (UniqueName: \"kubernetes.io/projected/4d32e4c2-7e36-4cf7-8369-982164414c7b-kube-api-access-8kstk\") pod \"packageserver-d55dfcdfc-whb54\" (UID: \"4d32e4c2-7e36-4cf7-8369-982164414c7b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.917479 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4nw5h"] Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.920068 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckqms\" (UniqueName: \"kubernetes.io/projected/1d7951da-3429-4ac6-b797-208606682d8e-kube-api-access-ckqms\") pod \"package-server-manager-789f6589d5-f954d\" (UID: \"1d7951da-3429-4ac6-b797-208606682d8e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f954d" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.935773 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xv2pd" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.947889 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5zdv\" (UniqueName: \"kubernetes.io/projected/a11e465b-c6bf-465d-877f-0f09de58c651-kube-api-access-n5zdv\") pod \"olm-operator-6b444d44fb-lbkqk\" (UID: \"a11e465b-c6bf-465d-877f-0f09de58c651\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.957584 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfxng\" (UniqueName: \"kubernetes.io/projected/12356db6-09ae-438c-a085-6b26ea3b97e8-kube-api-access-zfxng\") pod \"csi-hostpathplugin-cjb5x\" (UID: \"12356db6-09ae-438c-a085-6b26ea3b97e8\") " pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.960251 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgqsl\" (UniqueName: \"kubernetes.io/projected/b513f403-38c4-40af-a4b5-13df12b7f807-kube-api-access-mgqsl\") pod \"machine-config-operator-74547568cd-mhkq4\" (UID: \"b513f403-38c4-40af-a4b5-13df12b7f807\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4" Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.973951 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:47 crc kubenswrapper[4628]: E1211 05:15:47.974611 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:48.474595142 +0000 UTC m=+50.891941840 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:47 crc kubenswrapper[4628]: I1211 05:15:47.997676 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcw8s\" (UniqueName: \"kubernetes.io/projected/8d35c515-c721-4484-bef7-80e52206f26f-kube-api-access-fcw8s\") pod \"ingress-canary-jcfs4\" (UID: \"8d35c515-c721-4484-bef7-80e52206f26f\") " pod="openshift-ingress-canary/ingress-canary-jcfs4" Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.021015 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5"] Dec 11 05:15:48 crc kubenswrapper[4628]: W1211 05:15:48.031895 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d7972a_8fde_4878_a758_99ed42b3e4c5.slice/crio-184a3bbcf043fad975714f9abc9cbb6053d2b0cf7190f57f0c589e80b7f6d2d7 WatchSource:0}: Error finding container 184a3bbcf043fad975714f9abc9cbb6053d2b0cf7190f57f0c589e80b7f6d2d7: Status 404 returned error can't find the container with id 184a3bbcf043fad975714f9abc9cbb6053d2b0cf7190f57f0c589e80b7f6d2d7 Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.075679 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:48 crc kubenswrapper[4628]: E1211 05:15:48.076226 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:48.576212822 +0000 UTC m=+50.993559520 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.112693 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4" Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.121775 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk" Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.132722 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7x6xf" Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.134537 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-62v78"] Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.136043 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vvw9w"] Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.175930 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f954d" Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.178017 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:48 crc kubenswrapper[4628]: E1211 05:15:48.178476 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:48.678450688 +0000 UTC m=+51.095797386 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.182389 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.194481 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kjhs"] Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.196096 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jcfs4" Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.218352 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.279576 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:48 crc kubenswrapper[4628]: E1211 05:15:48.279838 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-11 05:15:48.779826433 +0000 UTC m=+51.197173131 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:48 crc kubenswrapper[4628]: W1211 05:15:48.351929 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85e69a46_d878_4004_9a92_c1ccc00000e9.slice/crio-b3c75c1be587934e36b7647253969c302b937146abbd61dc2f95aefb45abea9e WatchSource:0}: Error finding container b3c75c1be587934e36b7647253969c302b937146abbd61dc2f95aefb45abea9e: Status 404 returned error can't find the container with id b3c75c1be587934e36b7647253969c302b937146abbd61dc2f95aefb45abea9e Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.381205 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:48 crc kubenswrapper[4628]: E1211 05:15:48.381592 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:48.881575336 +0000 UTC m=+51.298922034 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.441305 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-62v78" event={"ID":"30bc62a3-63c3-4cab-bbdf-b790a500d378","Type":"ContainerStarted","Data":"b623ffc54005210e280f6c9e16c728b3d8ec3a3cdab9e9a622d46c1317b82962"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.442991 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-t7rx5" event={"ID":"12244fa9-e2af-46bc-a35c-acc85884e68b","Type":"ContainerStarted","Data":"ea5672208cb49c738c04f55d8a4a97b9687c6e769081e2cf03417e39bd56c1df"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.443764 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-t7rx5" Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.459928 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vvw9w" event={"ID":"12e38faa-255c-42bb-b9c1-faea88d6c989","Type":"ContainerStarted","Data":"5d6eabcf29b226a9ab6f8e1498b7b7d3e1fca372bdd60f92977342c0f25aae60"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.479528 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8z6mf" event={"ID":"b1dac6ca-2acb-4ec2-bd04-c307aa26c17f","Type":"ContainerStarted","Data":"f0b05956938cb4a908f5b3b4ad9bce721294f4c06894b1ea602fa66c53d399bc"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.487648 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:48 crc kubenswrapper[4628]: E1211 05:15:48.488258 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:48.98824015 +0000 UTC m=+51.405586848 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.534307 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" event={"ID":"d5a26a57-89a7-4c5c-902c-a19020e4a01a","Type":"ContainerStarted","Data":"b63f7f7ad748d56c454956681b0aca20e830c39035f125d22e91829a5ca681dd"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.535151 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.537489 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" event={"ID":"a88efb22-2511-4037-9257-102b56de5226","Type":"ContainerStarted","Data":"b3469c6501382bb73bab22c38af4d4b2baa9419f0d85cb3f80213f57423113df"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.537525 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" event={"ID":"a88efb22-2511-4037-9257-102b56de5226","Type":"ContainerStarted","Data":"112552a1ed4a9ba8643ac35c0f911a07b0f6e010f3355df863ee62c1df66294e"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.539997 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc" event={"ID":"949f77a9-d70d-45ad-8c60-554afb860a62","Type":"ContainerStarted","Data":"89b50fe4d61a4e10c845945dc931be30fc171e88c9cd9f051f5d1cb9fcb8df0d"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.540025 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc" event={"ID":"949f77a9-d70d-45ad-8c60-554afb860a62","Type":"ContainerStarted","Data":"14e6c6c2726caf7f5c59c0cc08d245056e1a83d4a9c1450747ed20a8546f999d"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.540738 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kjhs" event={"ID":"85e69a46-d878-4004-9a92-c1ccc00000e9","Type":"ContainerStarted","Data":"b3c75c1be587934e36b7647253969c302b937146abbd61dc2f95aefb45abea9e"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.541570 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" event={"ID":"53447f0d-9279-4e5f-a63c-a1b050d24b4b","Type":"ContainerStarted","Data":"d16aef54ec73c597be94dafa28134714d0263e7770ad82c17aa940fb20d83bf8"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.561605 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" event={"ID":"82af3a60-6260-410c-a3c2-16acf3f30bb5","Type":"ContainerStarted","Data":"aff68c6c32fd0c3636c549f427d30ae198c52ac6ea9aa7fdc3811347bf6095a4"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.561652 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" event={"ID":"82af3a60-6260-410c-a3c2-16acf3f30bb5","Type":"ContainerStarted","Data":"ec3828f094b797509546bad7f014cf53227f5271515a36b04e1f0c9370a8e53d"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.564319 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4q5" event={"ID":"575ba7ec-e024-40c7-be59-44a90232b4f2","Type":"ContainerStarted","Data":"c2bdb6f32afae1c6b1a5b30c2320d95720fbf2f589af320ab72f8b1216e78c7d"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.564341 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4q5" event={"ID":"575ba7ec-e024-40c7-be59-44a90232b4f2","Type":"ContainerStarted","Data":"7e0419fcc2ae79f11bfc300a0adda157edc6e6b8c4b6a2505fe0e85aa3734147"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.565030 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rfts6" event={"ID":"68d7972a-8fde-4878-a758-99ed42b3e4c5","Type":"ContainerStarted","Data":"184a3bbcf043fad975714f9abc9cbb6053d2b0cf7190f57f0c589e80b7f6d2d7"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.566662 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-76fr7" event={"ID":"652211a1-b8d0-427d-b6e0-abf88c891f25","Type":"ContainerStarted","Data":"8273503a93763d578c84007144da6bb49a3b85f156343970833980a4465697b1"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.566706 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-76fr7" event={"ID":"652211a1-b8d0-427d-b6e0-abf88c891f25","Type":"ContainerStarted","Data":"f081aacdd7141a1e25aacefe18ebf67058bc9e04579aca8686deed43d992de33"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.590213 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:48 crc kubenswrapper[4628]: E1211 05:15:48.591502 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:49.091482384 +0000 UTC m=+51.508829092 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.594987 4628 generic.go:334] "Generic (PLEG): container finished" podID="a2361138-5571-4a9b-8ac9-a0cac66d682a" containerID="ed19e6fc2f0ccd4f5187d71e02a27f67954a694233f54b7ee7c8a1b5ef9832c7" exitCode=0 Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.595311 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" event={"ID":"a2361138-5571-4a9b-8ac9-a0cac66d682a","Type":"ContainerDied","Data":"ed19e6fc2f0ccd4f5187d71e02a27f67954a694233f54b7ee7c8a1b5ef9832c7"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.605076 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds" event={"ID":"3da56a92-c008-4f39-825b-baedf4f3195d","Type":"ContainerStarted","Data":"cac3e054e70a14fbc440e0530baf471479485c71086e43549da104e46cbbac5c"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.607976 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw"] Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.609534 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.626292 4628 generic.go:334] "Generic (PLEG): container finished" podID="4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76" containerID="4a72e2d9cb09b021880bdf7afe6cd05fe185d75d2f782fcf12d046d63c5db97c" exitCode=0 Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.626379 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-n4h96" event={"ID":"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76","Type":"ContainerDied","Data":"4a72e2d9cb09b021880bdf7afe6cd05fe185d75d2f782fcf12d046d63c5db97c"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.635790 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k4bsx"] Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.653606 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" event={"ID":"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb","Type":"ContainerStarted","Data":"33a4cc2d8be09644a4a333c502c4103ffca7e1af9575fd8afe0f1161dab11cac"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.673013 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4nw5h" event={"ID":"5111b417-34a8-405f-a0b8-eab04e144ff8","Type":"ContainerStarted","Data":"333e7251ed6902bdf36ca2c25036a4cf58182e2132c20fe49bd99e863b186a17"} Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.684479 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" Dec 11 05:15:48 crc kubenswrapper[4628]: W1211 05:15:48.688198 4628 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03b99e5a_1844_4a5f_a449_b4c6d6ca1ae5.slice/crio-2bad51406b07cec34c2511871f1fa4afa0a4f4ea8568cda11b8819d526d62ec1 WatchSource:0}: Error finding container 2bad51406b07cec34c2511871f1fa4afa0a4f4ea8568cda11b8819d526d62ec1: Status 404 returned error can't find the container with id 2bad51406b07cec34c2511871f1fa4afa0a4f4ea8568cda11b8819d526d62ec1 Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.691483 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:48 crc kubenswrapper[4628]: E1211 05:15:48.694686 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:49.194663886 +0000 UTC m=+51.612010704 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.731042 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-2dgsz"] Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.792284 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:48 crc kubenswrapper[4628]: E1211 05:15:48.793627 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:49.293611984 +0000 UTC m=+51.710958702 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.855414 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-78pgf"] Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.893687 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:48 crc kubenswrapper[4628]: E1211 05:15:48.894320 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:49.394289609 +0000 UTC m=+51.811636307 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:48 crc kubenswrapper[4628]: I1211 05:15:48.997371 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:48 crc kubenswrapper[4628]: E1211 05:15:48.997697 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:49.497682616 +0000 UTC m=+51.915029314 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:49 crc kubenswrapper[4628]: W1211 05:15:49.022971 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e53868f_b961_440d_a046_4ef042fddbbf.slice/crio-7ad14fac342741f17db5efd00c46774f1f3e61af6137b02e3116fc4c86d184fc WatchSource:0}: Error finding container 7ad14fac342741f17db5efd00c46774f1f3e61af6137b02e3116fc4c86d184fc: Status 404 returned error can't find the container with id 7ad14fac342741f17db5efd00c46774f1f3e61af6137b02e3116fc4c86d184fc Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.068758 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-t7rx5" Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.119164 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:49 crc kubenswrapper[4628]: E1211 05:15:49.119433 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:49.619422422 +0000 UTC m=+52.036769120 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.234547 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:49 crc kubenswrapper[4628]: E1211 05:15:49.235231 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:49.735215721 +0000 UTC m=+52.152562419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.340792 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:49 crc kubenswrapper[4628]: E1211 05:15:49.341093 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:49.841080394 +0000 UTC m=+52.258427092 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.443505 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:49 crc kubenswrapper[4628]: E1211 05:15:49.443917 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:49.943901106 +0000 UTC m=+52.361247804 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.502530 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" podStartSLOduration=18.502513359 podStartE2EDuration="18.502513359s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:49.500494245 +0000 UTC m=+51.917840943" watchObservedRunningTime="2025-12-11 05:15:49.502513359 +0000 UTC m=+51.919860057" Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.547515 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:49 crc kubenswrapper[4628]: E1211 05:15:49.548209 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:50.048196877 +0000 UTC m=+52.465543575 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.650443 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:49 crc kubenswrapper[4628]: E1211 05:15:49.650626 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:50.150610679 +0000 UTC m=+52.567957377 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.650785 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:49 crc kubenswrapper[4628]: E1211 05:15:49.651139 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:50.151131863 +0000 UTC m=+52.568478561 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.677942 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xwptc"] Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.690063 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-czh2t"] Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.691945 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbf8w"] Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.760444 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:49 crc kubenswrapper[4628]: E1211 05:15:49.760982 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:50.260963521 +0000 UTC m=+52.678310209 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.770319 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4q5" event={"ID":"575ba7ec-e024-40c7-be59-44a90232b4f2","Type":"ContainerStarted","Data":"751f19c1d39b87bcfa3b4a5a04d932f6c093aa49efeaa692a1ee67a4688b2519"} Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.784238 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" podStartSLOduration=19.784212662 podStartE2EDuration="19.784212662s" podCreationTimestamp="2025-12-11 05:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:49.768493702 +0000 UTC m=+52.185840400" watchObservedRunningTime="2025-12-11 05:15:49.784212662 +0000 UTC m=+52.201559360" Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.784653 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6vrgl"] Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.796500 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" event={"ID":"1eefadbf-ac92-4b97-999e-fb262b5d45c2","Type":"ContainerStarted","Data":"8cdde9458cef48bbd9c4dbe1b5ebc960e1d346cf76c9c20a6bed01e037e156c8"} Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.802333 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw" event={"ID":"03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5","Type":"ContainerStarted","Data":"2bad51406b07cec34c2511871f1fa4afa0a4f4ea8568cda11b8819d526d62ec1"} Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.820230 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k4bsx" event={"ID":"8ef42e00-ffb9-43db-b82e-ee36516674ce","Type":"ContainerStarted","Data":"75c4a5a154ced0559d0f11dc8574e3b772b59a352e453d157f21701d22eb00b3"} Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.831653 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-78pgf" event={"ID":"e22056a0-8001-488d-9dd7-9368d4a459e8","Type":"ContainerStarted","Data":"de9b8366e372870f60c7929f42666dd382f159b09fdd02f7c80bdb96d766705c"} Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.834308 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dgsz" event={"ID":"1e53868f-b961-440d-a046-4ef042fddbbf","Type":"ContainerStarted","Data":"7ad14fac342741f17db5efd00c46774f1f3e61af6137b02e3116fc4c86d184fc"} Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.850963 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-76fr7" 
event={"ID":"652211a1-b8d0-427d-b6e0-abf88c891f25","Type":"ContainerStarted","Data":"5128891f4da19c9301df5215ef1b76c49f206570d5130012316a5c742ea959f8"} Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.861655 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:49 crc kubenswrapper[4628]: E1211 05:15:49.863257 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:50.363238829 +0000 UTC m=+52.780585607 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:49 crc kubenswrapper[4628]: I1211 05:15:49.965814 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:49 crc kubenswrapper[4628]: E1211 05:15:49.967759 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:50.467733866 +0000 UTC m=+52.885080564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.028488 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6"] Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.038957 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zmqdv"] Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.043907 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-kbrpw" podStartSLOduration=19.043888847 podStartE2EDuration="19.043888847s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:50.028313952 +0000 UTC m=+52.445660650" watchObservedRunningTime="2025-12-11 05:15:50.043888847 +0000 UTC m=+52.461235545" Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.052272 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qjp9m" event={"ID":"b07b0692-4062-4a52-8689-be350bf137cb","Type":"ContainerStarted","Data":"f85cf76f65a8710d555932aeb3064f3799efcf4021e17ddf6aad52b8d284cd16"} Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.068043 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:50 crc kubenswrapper[4628]: E1211 05:15:50.068402 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:50.56838153 +0000 UTC m=+52.985728328 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.130095 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" podStartSLOduration=20.130080605 podStartE2EDuration="20.130080605s" podCreationTimestamp="2025-12-11 05:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:50.099694905 +0000 UTC m=+52.517041603" watchObservedRunningTime="2025-12-11 05:15:50.130080605 +0000 UTC m=+52.547427303" Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.173288 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:50 crc kubenswrapper[4628]: E1211 05:15:50.174434 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:50.674415498 +0000 UTC m=+53.091762196 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.219715 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-t7rx5" podStartSLOduration=20.219694455 podStartE2EDuration="20.219694455s" podCreationTimestamp="2025-12-11 05:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:50.212533204 +0000 UTC m=+52.629879902" watchObservedRunningTime="2025-12-11 05:15:50.219694455 +0000 UTC m=+52.637041153" Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.282288 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:50 crc kubenswrapper[4628]: E1211 05:15:50.282577 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:50.782568192 +0000 UTC m=+53.199914890 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:50 crc kubenswrapper[4628]: W1211 05:15:50.376303 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce2358a9_7f38_41e7_ba92_b82e8e98b458.slice/crio-0424bead60a9b2dc8491f36eec8d0cdbce4a923654d43cc0322ba04b83ef11a1 WatchSource:0}: Error finding container 0424bead60a9b2dc8491f36eec8d0cdbce4a923654d43cc0322ba04b83ef11a1: Status 404 returned error can't find the container with id 0424bead60a9b2dc8491f36eec8d0cdbce4a923654d43cc0322ba04b83ef11a1 Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.384181 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:50 crc kubenswrapper[4628]: E1211 05:15:50.384550 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-11 05:15:50.884536481 +0000 UTC m=+53.301883179 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.430032 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-76fr7" podStartSLOduration=20.430013584 podStartE2EDuration="20.430013584s" podCreationTimestamp="2025-12-11 05:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:50.429272585 +0000 UTC m=+52.846619283" watchObservedRunningTime="2025-12-11 05:15:50.430013584 +0000 UTC m=+52.847360282" Dec 11 05:15:50 crc kubenswrapper[4628]: W1211 05:15:50.468967 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcff00c77_bf53_43ce_a5ac_62d5a9264c9b.slice/crio-b534236470b2ef8bd949566ebfb4e185ecdef93e224dac7490381b6b79de7fef WatchSource:0}: Error finding container b534236470b2ef8bd949566ebfb4e185ecdef93e224dac7490381b6b79de7fef: Status 404 returned error can't find the container with id b534236470b2ef8bd949566ebfb4e185ecdef93e224dac7490381b6b79de7fef Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.474117 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cjb5x"] Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.487003 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:50 crc kubenswrapper[4628]: E1211 05:15:50.487349 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:50.987337743 +0000 UTC m=+53.404684441 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.490945 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4q5" podStartSLOduration=19.490927319 podStartE2EDuration="19.490927319s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:50.490328463 +0000 UTC m=+52.907675161" watchObservedRunningTime="2025-12-11 05:15:50.490927319 +0000 UTC m=+52.908274017" Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.502687 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f954d"] Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.609217 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:50 crc kubenswrapper[4628]: E1211 05:15:50.609805 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:51.109789828 +0000 UTC m=+53.527136526 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.610040 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7x6xf"] Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.682041 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qmzs9"] Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.688958 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7kdnp"] Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.710578 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:50 crc kubenswrapper[4628]: E1211 05:15:50.710899 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:51.210886434 +0000 UTC m=+53.628233132 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:50 crc kubenswrapper[4628]: W1211 05:15:50.723470 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d7951da_3429_4ac6_b797_208606682d8e.slice/crio-f33b51a12c9a2de4fdcc6904e261ac6f19c36523fa90fc6166468bc8ce16e92a WatchSource:0}: Error finding container f33b51a12c9a2de4fdcc6904e261ac6f19c36523fa90fc6166468bc8ce16e92a: Status 404 returned error can't find the container with id f33b51a12c9a2de4fdcc6904e261ac6f19c36523fa90fc6166468bc8ce16e92a Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.815195 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:50 crc kubenswrapper[4628]: E1211 05:15:50.815858 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-11 05:15:51.315827563 +0000 UTC m=+53.733174261 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.828676 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4"] Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.870197 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xv2pd"] Dec 11 05:15:50 crc kubenswrapper[4628]: I1211 05:15:50.922060 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:50 crc kubenswrapper[4628]: E1211 05:15:50.922374 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:51.422362845 +0000 UTC m=+53.839709533 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:50 crc kubenswrapper[4628]: W1211 05:15:50.947998 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a9a689c_7b60_4ab3_b842_62e7b3dca41c.slice/crio-f849f835d97a043471b7a8e7ea3eb977194f108cebe66c191106b9664d8429af WatchSource:0}: Error finding container f849f835d97a043471b7a8e7ea3eb977194f108cebe66c191106b9664d8429af: Status 404 returned error can't find the container with id f849f835d97a043471b7a8e7ea3eb977194f108cebe66c191106b9664d8429af Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.022819 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:51 crc kubenswrapper[4628]: E1211 05:15:51.023006 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:51.522977838 +0000 UTC m=+53.940324536 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.023298 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:51 crc kubenswrapper[4628]: E1211 05:15:51.023676 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:51.523658456 +0000 UTC m=+53.941005154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.088371 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk"] Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.089375 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw" event={"ID":"03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5","Type":"ContainerStarted","Data":"39019208b95c3fb3f940a54894fad7f299642adfc482a5af81870016fdfc23ba"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.093377 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" event={"ID":"12356db6-09ae-438c-a085-6b26ea3b97e8","Type":"ContainerStarted","Data":"895881de6f71044720e421ca9841c3a13f7cdc8a81f8f542b6533046c18fef1d"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.094048 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xwptc" event={"ID":"231b7565-2d1b-4c1e-be8e-8f1d1dd3f558","Type":"ContainerStarted","Data":"2e4f1c9bd0c66cad7d49199fbacc4595c325651a45a999b24a6272951f3e2db3"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.094673 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qmzs9" event={"ID":"ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6","Type":"ContainerStarted","Data":"6888416776d5ce1cdfdf1ad3ea442ca47f2c5d38d44f5742fc0b005c53d99e38"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.106429 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rfts6" 
event={"ID":"68d7972a-8fde-4878-a758-99ed42b3e4c5","Type":"ContainerStarted","Data":"554fdbbe51849a9a9ddec29c3ccc09724ea8b1b1c42d35c8c7d35efb4d39c324"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.107232 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6" event={"ID":"fe8af448-2223-4442-9c1d-2ea4948b0c12","Type":"ContainerStarted","Data":"90bf28b99f75dd0a51d8e74bdc494102b2061ec7e836f7929bd71d052890d8f6"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.107965 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zmqdv" event={"ID":"cff00c77-bf53-43ce-a5ac-62d5a9264c9b","Type":"ContainerStarted","Data":"b534236470b2ef8bd949566ebfb4e185ecdef93e224dac7490381b6b79de7fef"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.127799 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:51 crc kubenswrapper[4628]: E1211 05:15:51.128436 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:51.628419049 +0000 UTC m=+54.045765747 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.137671 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54"] Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.143642 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch"] Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.143825 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dgsz" event={"ID":"1e53868f-b961-440d-a046-4ef042fddbbf","Type":"ContainerStarted","Data":"cefed747460a599151a4dc7678208457f47a50318dd7e9f16bf369ca395c6d4c"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.158772 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4" event={"ID":"b513f403-38c4-40af-a4b5-13df12b7f807","Type":"ContainerStarted","Data":"ad79c27f165a3ad8e993ce9069329c5c8c0d59a1ccd8527dfe844d2578b1197e"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.163837 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xsldw"] Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.166493 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qjp9m" 
event={"ID":"b07b0692-4062-4a52-8689-be350bf137cb","Type":"ContainerStarted","Data":"97a68c8bfe136bc2b7493ad4611601f02f520b1752016840f12c41719b983b03"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.169876 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" event={"ID":"ce2358a9-7f38-41e7-ba92-b82e8e98b458","Type":"ContainerStarted","Data":"0424bead60a9b2dc8491f36eec8d0cdbce4a923654d43cc0322ba04b83ef11a1"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.178202 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4nw5h" event={"ID":"5111b417-34a8-405f-a0b8-eab04e144ff8","Type":"ContainerStarted","Data":"f50891023cfcf0af4d65d3ad86c94c09d92f91b6c0b5954b92f177e8b565045c"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.183026 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbf8w" event={"ID":"3fc85c6c-0509-4b32-b9eb-fe58f5b9306d","Type":"ContainerStarted","Data":"968d7e03c0cd5ebf9a4820a25420626c6af5960d6080f62d6e899e2626ed19f0"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.185070 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7x6xf" event={"ID":"7bde36a3-0865-4d8d-a741-4efc59e3b409","Type":"ContainerStarted","Data":"354d4eb3740b11ad9cb4e2cb970753c20c2a192ec90a5f71001cb7dcf22cccf8"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.186664 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vvw9w" event={"ID":"12e38faa-255c-42bb-b9c1-faea88d6c989","Type":"ContainerStarted","Data":"e4f4def9a22b53c19c72557160bf608c2b4c0f216c26dab5fa7e76c75f26e725"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.190347 4628 generic.go:334] "Generic (PLEG): container finished" podID="53447f0d-9279-4e5f-a63c-a1b050d24b4b" containerID="a9ca5fdccff0fcc8c5398a165360539ffae1aa10ab7a0055bd273ef92f50ced9" exitCode=0 Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.190674 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" event={"ID":"53447f0d-9279-4e5f-a63c-a1b050d24b4b","Type":"ContainerDied","Data":"a9ca5fdccff0fcc8c5398a165360539ffae1aa10ab7a0055bd273ef92f50ced9"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.193014 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7kdnp" event={"ID":"342f0865-c965-4465-b45c-42b0f84af9e1","Type":"ContainerStarted","Data":"c3dedbb94c64b6eaaeb6c649814ad6b09161541db076b2280db0a5bf31cdb90f"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.193818 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-qjp9m" podStartSLOduration=7.193804333 podStartE2EDuration="7.193804333s" podCreationTimestamp="2025-12-11 05:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:51.190385152 +0000 UTC m=+53.607731850" watchObservedRunningTime="2025-12-11 05:15:51.193804333 +0000 UTC m=+53.611151031" Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.198117 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jcfs4"] Dec 11 05:15:51 crc 
kubenswrapper[4628]: I1211 05:15:51.201563 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8z6mf" event={"ID":"b1dac6ca-2acb-4ec2-bd04-c307aa26c17f","Type":"ContainerStarted","Data":"5a6bce9aaa04a28f2b6713b2500fb2b37309ea9983bdcbb711d5e7b2fdd08d70"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.208706 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" event={"ID":"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb","Type":"ContainerStarted","Data":"d8e1961a7e7644e5ce3bf748c250af80bbbe86b73756cba30cfead0c42fee1e1"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.209798 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.213620 4628 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-wg98b container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" start-of-body= Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.213664 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" podUID="a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.220877 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vvw9w" podStartSLOduration=21.220835954000002 podStartE2EDuration="21.220835954s" podCreationTimestamp="2025-12-11 05:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:51.220609948 +0000 UTC m=+53.637956646" watchObservedRunningTime="2025-12-11 05:15:51.220835954 +0000 UTC m=+53.638182652" Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.227275 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xv2pd" event={"ID":"8a9a689c-7b60-4ab3-b842-62e7b3dca41c","Type":"ContainerStarted","Data":"f849f835d97a043471b7a8e7ea3eb977194f108cebe66c191106b9664d8429af"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.231165 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:51 crc kubenswrapper[4628]: E1211 05:15:51.232484 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:51.732470595 +0000 UTC m=+54.149817293 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.240447 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds" event={"ID":"3da56a92-c008-4f39-825b-baedf4f3195d","Type":"ContainerStarted","Data":"74fc3576e559d50438ab09b6f8c5d347840f2bb41dd657999759f1afd14ffdc5"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.253612 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-4nw5h" podStartSLOduration=21.253588718 podStartE2EDuration="21.253588718s" podCreationTimestamp="2025-12-11 05:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:51.253386752 +0000 UTC m=+53.670733460" watchObservedRunningTime="2025-12-11 05:15:51.253588718 +0000 UTC m=+53.670935416" Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.282194 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" podStartSLOduration=21.28217077 podStartE2EDuration="21.28217077s" podCreationTimestamp="2025-12-11 05:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:51.276241412 +0000 UTC m=+53.693588130" watchObservedRunningTime="2025-12-11 05:15:51.28217077 +0000 UTC m=+53.699517468" Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.296747 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-78pgf" event={"ID":"e22056a0-8001-488d-9dd7-9368d4a459e8","Type":"ContainerStarted","Data":"ac2cb4e211c01b602f0dac2ae76c765b1a0a0f6e26fefc6c9c7ed18fbd15de56"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.304268 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-54kds" podStartSLOduration=20.304047283 podStartE2EDuration="20.304047283s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:51.302415069 +0000 UTC m=+53.719761777" watchObservedRunningTime="2025-12-11 05:15:51.304047283 +0000 UTC m=+53.721393981" Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.321912 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc" event={"ID":"949f77a9-d70d-45ad-8c60-554afb860a62","Type":"ContainerStarted","Data":"a9fb92ae5271ead363d87a6ab7aabcd16dd61831700f98380c3dd810cb95a210"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.331906 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-8z6mf" podStartSLOduration=20.331890316 podStartE2EDuration="20.331890316s" podCreationTimestamp="2025-12-11 
05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:51.329677517 +0000 UTC m=+53.747024205" watchObservedRunningTime="2025-12-11 05:15:51.331890316 +0000 UTC m=+53.749237014" Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.334089 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f954d" event={"ID":"1d7951da-3429-4ac6-b797-208606682d8e","Type":"ContainerStarted","Data":"f33b51a12c9a2de4fdcc6904e261ac6f19c36523fa90fc6166468bc8ce16e92a"} Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.334773 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:51 crc kubenswrapper[4628]: E1211 05:15:51.335951 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:51.835927803 +0000 UTC m=+54.253274501 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.354760 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-czh2t" event={"ID":"159f3336-509f-41f1-ad07-380009d48dd7","Type":"ContainerStarted","Data":"5d53af9e09d35d709d12cbf7bfc01330daaafa8d9a643a32531144d67f3fd95d"} Dec 11 05:15:51 crc kubenswrapper[4628]: W1211 05:15:51.418533 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod209cebdd_7761_42a6_9bf1_089cc06c3dca.slice/crio-994ae7bde5429dc153d8ed22126ee2ce8f512612a1ebe3a917f4346646ac13c7 WatchSource:0}: Error finding container 994ae7bde5429dc153d8ed22126ee2ce8f512612a1ebe3a917f4346646ac13c7: Status 404 returned error can't find the container with id 994ae7bde5429dc153d8ed22126ee2ce8f512612a1ebe3a917f4346646ac13c7 Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.439080 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mfqxc" podStartSLOduration=21.439065454 podStartE2EDuration="21.439065454s" podCreationTimestamp="2025-12-11 05:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:51.438262213 +0000 UTC m=+53.855608921" watchObservedRunningTime="2025-12-11 05:15:51.439065454 +0000 UTC m=+53.856412152" Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.441602 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:51 crc kubenswrapper[4628]: E1211 05:15:51.444480 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:51.944466838 +0000 UTC m=+54.361813536 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.543043 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:51 crc kubenswrapper[4628]: E1211 05:15:51.543200 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:52.04317888 +0000 UTC m=+54.460525578 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.543727 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:51 crc kubenswrapper[4628]: E1211 05:15:51.544376 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:52.044365422 +0000 UTC m=+54.461712120 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.644659 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:51 crc kubenswrapper[4628]: E1211 05:15:51.644823 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:52.144798291 +0000 UTC m=+54.562144989 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.644954 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:51 crc kubenswrapper[4628]: E1211 05:15:51.645454 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:52.145442208 +0000 UTC m=+54.562788906 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.746315 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:51 crc kubenswrapper[4628]: E1211 05:15:51.747015 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:52.246999816 +0000 UTC m=+54.664346514 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.791903 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.804262 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:15:51 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:15:51 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:15:51 crc kubenswrapper[4628]: healthz check failed Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.804313 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.852635 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:51 crc kubenswrapper[4628]: E1211 05:15:51.853046 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:52.353029324 +0000 UTC m=+54.770376022 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.956010 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:51 crc kubenswrapper[4628]: E1211 05:15:51.956291 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:52.456251697 +0000 UTC m=+54.873598395 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:51 crc kubenswrapper[4628]: I1211 05:15:51.956502 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:51 crc kubenswrapper[4628]: E1211 05:15:51.956914 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:52.456901064 +0000 UTC m=+54.874247762 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.060548 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:52 crc kubenswrapper[4628]: E1211 05:15:52.060979 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:52.560948228 +0000 UTC m=+54.978294926 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.172528 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:52 crc kubenswrapper[4628]: E1211 05:15:52.172795 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:52.672785161 +0000 UTC m=+55.090131859 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.273717 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:52 crc kubenswrapper[4628]: E1211 05:15:52.274326 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:52.774311998 +0000 UTC m=+55.191658696 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.376503 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:52 crc kubenswrapper[4628]: E1211 05:15:52.376888 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:52.876875464 +0000 UTC m=+55.294222162 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.477516 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:52 crc kubenswrapper[4628]: E1211 05:15:52.477858 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:52.977824166 +0000 UTC m=+55.395170864 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.489862 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbf8w" event={"ID":"3fc85c6c-0509-4b32-b9eb-fe58f5b9306d","Type":"ContainerStarted","Data":"3eb9b9cece9dda9d6f753e1369d068820fcb8269d29f98998367bb80339a6996"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.501873 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qmzs9" event={"ID":"ca721ae1-eaa8-40f7-b8f8-0e77ed8bc0e6","Type":"ContainerStarted","Data":"59a32a58c150e2cb5f3f424d81eb0699939b9a4d3adb17b47bac562c16987192"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.513129 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k4bsx" event={"ID":"8ef42e00-ffb9-43db-b82e-ee36516674ce","Type":"ContainerStarted","Data":"6e99e6f273c58e4e7739ff96e26f611912c845699094840cde31a5b274b96cb8"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.521517 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbf8w" podStartSLOduration=22.521503111 podStartE2EDuration="22.521503111s" podCreationTimestamp="2025-12-11 05:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:52.519362554 +0000 UTC m=+54.936709252" watchObservedRunningTime="2025-12-11 05:15:52.521503111 +0000 UTC m=+54.938849809" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.581674 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:52 crc kubenswrapper[4628]: E1211 05:15:52.582259 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:53.082248101 +0000 UTC m=+55.499594799 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.591604 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-k4bsx" podStartSLOduration=21.59158325 podStartE2EDuration="21.59158325s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:52.590879531 +0000 UTC m=+55.008226229" watchObservedRunningTime="2025-12-11 05:15:52.59158325 +0000 UTC m=+55.008929958" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.615336 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xwptc" event={"ID":"231b7565-2d1b-4c1e-be8e-8f1d1dd3f558","Type":"ContainerStarted","Data":"221b11f060d9322efeede1808ab8b9a21304b077e4afcb20afedf75d521e4e5a"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.639897 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-n4h96" event={"ID":"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76","Type":"ContainerStarted","Data":"f3d140e17197eeacceae94f04232c7877c144df1873bf04c2a2edcf75eae6c31"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.640828 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-qmzs9" podStartSLOduration=21.640818083 podStartE2EDuration="21.640818083s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:52.639979061 +0000 UTC m=+55.057325759" watchObservedRunningTime="2025-12-11 05:15:52.640818083 +0000 UTC m=+55.058164781" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.672512 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-xwptc" podStartSLOduration=21.672494538 podStartE2EDuration="21.672494538s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:52.672032625 +0000 UTC m=+55.089379323" watchObservedRunningTime="2025-12-11 05:15:52.672494538 +0000 UTC 
m=+55.089841236" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.682922 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:52 crc kubenswrapper[4628]: E1211 05:15:52.684255 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:53.184240781 +0000 UTC m=+55.601587479 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.684693 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4" event={"ID":"b513f403-38c4-40af-a4b5-13df12b7f807","Type":"ContainerStarted","Data":"7f9e438ffe7867fa98df3dcbf15821c6af7bc5d7e0639bf4a789c9a4b7dfc3c9"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.711544 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" event={"ID":"4d32e4c2-7e36-4cf7-8369-982164414c7b","Type":"ContainerStarted","Data":"9e1a97f6e2cc01ccdea54ae6f989aeff41d4c3590e41e528114eaf3ed346f9c9"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.711587 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" event={"ID":"4d32e4c2-7e36-4cf7-8369-982164414c7b","Type":"ContainerStarted","Data":"e55e65fabeb523b880148de0e9f1e197ccddb75a414ed7fc51c4cb0bc61095c7"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.712064 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.722199 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xsldw" event={"ID":"a73e280a-008c-4e72-8844-375de50d4222","Type":"ContainerStarted","Data":"8d390c4bae67f1f337de517fae76cd3a14e5e99698bb3fd8117a9367c68fccb2"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.734980 4628 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-whb54 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.735045 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" podUID="4d32e4c2-7e36-4cf7-8369-982164414c7b" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: 
connect: connection refused" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.748193 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6" event={"ID":"fe8af448-2223-4442-9c1d-2ea4948b0c12","Type":"ContainerStarted","Data":"ea4a1c693cb1686fef9ddef937c9178c36b77f783d9ca50e65020339b442c552"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.748457 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.749302 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" podStartSLOduration=21.749287276 podStartE2EDuration="21.749287276s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:52.743114891 +0000 UTC m=+55.160461589" watchObservedRunningTime="2025-12-11 05:15:52.749287276 +0000 UTC m=+55.166633974" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.764037 4628 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-m8br6 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.764091 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6" podUID="fe8af448-2223-4442-9c1d-2ea4948b0c12" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.766098 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zmqdv" event={"ID":"cff00c77-bf53-43ce-a5ac-62d5a9264c9b","Type":"ContainerStarted","Data":"10ab9e4cc877597ea5ba3b7cc42fa2b9017749b4d2dc5890fca2603e49820cd6"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.770863 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-czh2t" event={"ID":"159f3336-509f-41f1-ad07-380009d48dd7","Type":"ContainerStarted","Data":"c79615faee9690e1460046d8b7f60a870fb817f3e4a94449f64b6f019aa3e977"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.774512 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" event={"ID":"ce2358a9-7f38-41e7-ba92-b82e8e98b458","Type":"ContainerStarted","Data":"0924ac6e1d31b618731ffe5cff5071db9f2e1b0f1e54a28173a4d10a18fca739"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.775216 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.784325 4628 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6vrgl container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" 
start-of-body= Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.784382 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" podUID="ce2358a9-7f38-41e7-ba92-b82e8e98b458" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.784908 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:52 crc kubenswrapper[4628]: E1211 05:15:52.786702 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:53.286687473 +0000 UTC m=+55.704034171 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.797281 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:15:52 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:15:52 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:15:52 crc kubenswrapper[4628]: healthz check failed Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.797471 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.804634 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6" podStartSLOduration=21.804615141 podStartE2EDuration="21.804615141s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:52.804206171 +0000 UTC m=+55.221552879" watchObservedRunningTime="2025-12-11 05:15:52.804615141 +0000 UTC m=+55.221961839" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.812710 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kjhs" event={"ID":"85e69a46-d878-4004-9a92-c1ccc00000e9","Type":"ContainerStarted","Data":"6ec1ed693bb30e2495141cd6d85a75448a7a5751c4f5ce5f5fc28c85b2742e82"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.833665 4628 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" podStartSLOduration=21.833652545 podStartE2EDuration="21.833652545s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:52.832210467 +0000 UTC m=+55.249557165" watchObservedRunningTime="2025-12-11 05:15:52.833652545 +0000 UTC m=+55.250999243" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.870947 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" event={"ID":"1eefadbf-ac92-4b97-999e-fb262b5d45c2","Type":"ContainerStarted","Data":"28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.871544 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.884498 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-czh2t" podStartSLOduration=21.884484091 podStartE2EDuration="21.884484091s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:52.882299332 +0000 UTC m=+55.299646020" watchObservedRunningTime="2025-12-11 05:15:52.884484091 +0000 UTC m=+55.301830779" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.885552 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:52 crc kubenswrapper[4628]: E1211 05:15:52.887128 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:53.387108671 +0000 UTC m=+55.804455369 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.894766 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f954d" event={"ID":"1d7951da-3429-4ac6-b797-208606682d8e","Type":"ContainerStarted","Data":"13b7fa9248089630ddb0fe5c4bc898625c5fa2d9d65846f6ef6fafd6555959c7"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.906550 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch" event={"ID":"209cebdd-7761-42a6-9bf1-089cc06c3dca","Type":"ContainerStarted","Data":"994ae7bde5429dc153d8ed22126ee2ce8f512612a1ebe3a917f4346646ac13c7"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.914111 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xv2pd" podStartSLOduration=21.914098291 podStartE2EDuration="21.914098291s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:52.912434906 +0000 UTC m=+55.329781604" watchObservedRunningTime="2025-12-11 05:15:52.914098291 +0000 UTC m=+55.331444989" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.916424 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" event={"ID":"a2361138-5571-4a9b-8ac9-a0cac66d682a","Type":"ContainerStarted","Data":"08b2a7e34ccb66a6a72ad2a3018a5c85d8d67dfd728663275f455541800f5a9f"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.917792 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk" event={"ID":"a11e465b-c6bf-465d-877f-0f09de58c651","Type":"ContainerStarted","Data":"4e401b0bd0a05ceb51e2e903c102495225a4b5a0ce3200f420dc8ae38b4e52ba"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.933037 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw" event={"ID":"03b99e5a-1844-4a5f-a449-b4c6d6ca1ae5","Type":"ContainerStarted","Data":"159580f67dd515a32baa0fa70fc385193fea08c1e23f369c66db442a296ddab2"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.935629 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-62v78" event={"ID":"30bc62a3-63c3-4cab-bbdf-b790a500d378","Type":"ContainerStarted","Data":"2010b4f29ab1521031bdc5c55e404b95e8ab760c3505244b09be850741493f2c"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.944711 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jcfs4" event={"ID":"8d35c515-c721-4484-bef7-80e52206f26f","Type":"ContainerStarted","Data":"fa04d94f3eb3ad238a8ca93621630b6c0cf57280a974d0c83d255821a3f0eedd"} Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.958637 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.964705 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7kjhs" podStartSLOduration=21.96468705 podStartE2EDuration="21.96468705s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:52.946629798 +0000 UTC m=+55.363976496" watchObservedRunningTime="2025-12-11 05:15:52.96468705 +0000 UTC m=+55.382033738" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.985038 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h9ckw" podStartSLOduration=21.985025912 podStartE2EDuration="21.985025912s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:52.983125661 +0000 UTC m=+55.400472359" watchObservedRunningTime="2025-12-11 05:15:52.985025912 +0000 UTC m=+55.402372600" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.985400 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" podStartSLOduration=8.985395362 podStartE2EDuration="8.985395362s" podCreationTimestamp="2025-12-11 05:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:52.966862848 +0000 UTC m=+55.384209546" watchObservedRunningTime="2025-12-11 05:15:52.985395362 +0000 UTC m=+55.402742060" Dec 11 05:15:52 crc kubenswrapper[4628]: I1211 05:15:52.992728 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.003726 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" Dec 11 05:15:53 crc kubenswrapper[4628]: E1211 05:15:53.005272 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:53.505249361 +0000 UTC m=+55.922596049 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.053551 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-78pgf" podStartSLOduration=22.053535 podStartE2EDuration="22.053535s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:53.053415617 +0000 UTC m=+55.470762315" watchObservedRunningTime="2025-12-11 05:15:53.053535 +0000 UTC m=+55.470881688" Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.095304 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:53 crc kubenswrapper[4628]: E1211 05:15:53.095624 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:53.595609392 +0000 UTC m=+56.012956090 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.149582 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" podStartSLOduration=22.14955572 podStartE2EDuration="22.14955572s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:53.118386329 +0000 UTC m=+55.535733027" watchObservedRunningTime="2025-12-11 05:15:53.14955572 +0000 UTC m=+55.566902408" Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.197034 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:53 crc kubenswrapper[4628]: E1211 05:15:53.197426 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:53.697411106 +0000 UTC m=+56.114757794 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.298294 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:53 crc kubenswrapper[4628]: E1211 05:15:53.298489 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:53.798462121 +0000 UTC m=+56.215808819 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.298760 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:53 crc kubenswrapper[4628]: E1211 05:15:53.299094 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:53.799081437 +0000 UTC m=+56.216428135 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.399602 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:53 crc kubenswrapper[4628]: E1211 05:15:53.399776 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:53.899752152 +0000 UTC m=+56.317098850 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.399929 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:53 crc kubenswrapper[4628]: E1211 05:15:53.400216 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:53.900205494 +0000 UTC m=+56.317552192 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.505035 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:53 crc kubenswrapper[4628]: E1211 05:15:53.505163 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:54.005143503 +0000 UTC m=+56.422490201 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.505651 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:53 crc kubenswrapper[4628]: E1211 05:15:53.505957 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:54.005949554 +0000 UTC m=+56.423296252 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.606574 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:53 crc kubenswrapper[4628]: E1211 05:15:53.606748 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:54.106729802 +0000 UTC m=+56.524076500 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.606824 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:53 crc kubenswrapper[4628]: E1211 05:15:53.607279 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:54.107270466 +0000 UTC m=+56.524617164 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.708062 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:53 crc kubenswrapper[4628]: E1211 05:15:53.708251 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:54.208210678 +0000 UTC m=+56.625557376 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.708343 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:53 crc kubenswrapper[4628]: E1211 05:15:53.708635 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:54.208627099 +0000 UTC m=+56.625973797 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.795207 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:15:53 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:15:53 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:15:53 crc kubenswrapper[4628]: healthz check failed Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.795275 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.809536 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:53 crc kubenswrapper[4628]: E1211 05:15:53.809766 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:54.309733297 +0000 UTC m=+56.727080005 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.809855 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:53 crc kubenswrapper[4628]: E1211 05:15:53.810146 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:54.310135387 +0000 UTC m=+56.727482085 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.887803 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-whf6x"] Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.911242 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:53 crc kubenswrapper[4628]: E1211 05:15:53.911579 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:54.411555042 +0000 UTC m=+56.828901740 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.949740 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zmqdv" event={"ID":"cff00c77-bf53-43ce-a5ac-62d5a9264c9b","Type":"ContainerStarted","Data":"05e37190d2940271baac299b79916cec9aae8039c4c434fd4f62643c706e619d"} Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.953675 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-62v78" event={"ID":"30bc62a3-63c3-4cab-bbdf-b790a500d378","Type":"ContainerStarted","Data":"09612bf6fbc356eb28d3842ab16428df561ae3a35f8901736ffc3e8d15c64ace"} Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.955294 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xv2pd" event={"ID":"8a9a689c-7b60-4ab3-b842-62e7b3dca41c","Type":"ContainerStarted","Data":"7f7f648999da2bade93657700d9dcde2121f969025b378bbbde57d6d40ac7ca8"} Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.956663 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f954d" event={"ID":"1d7951da-3429-4ac6-b797-208606682d8e","Type":"ContainerStarted","Data":"919ec74d7a29fada26ec0b6db478c5d8e166cd04dd5559f05300beaa5d3d6437"} Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.957644 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7kdnp" event={"ID":"342f0865-c965-4465-b45c-42b0f84af9e1","Type":"ContainerStarted","Data":"9c6c007ba2cda7c3663094b4874b5d149ab08e5a9c08ae476ea23c91ebdb1881"} Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.958902 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4" event={"ID":"b513f403-38c4-40af-a4b5-13df12b7f807","Type":"ContainerStarted","Data":"a964e8c84d1e04b37e0c398907ee499bf013fd1deb344a24f5141f9772079348"} Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.960157 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rfts6" event={"ID":"68d7972a-8fde-4878-a758-99ed42b3e4c5","Type":"ContainerStarted","Data":"ac37a2d9e545bee2a463dfd6f941ebb48344b2516e0d7c33f5783184d5e2b639"} Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.963711 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" event={"ID":"53447f0d-9279-4e5f-a63c-a1b050d24b4b","Type":"ContainerStarted","Data":"426dd71273511eb09748315efca10afac436d681b5ebb21a29013f097475b320"} Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.964215 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.966862 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zmqdv" podStartSLOduration=22.966835436 podStartE2EDuration="22.966835436s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:53.964990437 +0000 UTC m=+56.382337135" watchObservedRunningTime="2025-12-11 05:15:53.966835436 +0000 UTC m=+56.384182134" Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.970683 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dgsz" event={"ID":"1e53868f-b961-440d-a046-4ef042fddbbf","Type":"ContainerStarted","Data":"859874f2992f37041497e6af29c95e148decc649e3a6a3d60f2250435bfd8960"} Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.971679 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jcfs4" event={"ID":"8d35c515-c721-4484-bef7-80e52206f26f","Type":"ContainerStarted","Data":"875fa7010e065e4b4bef410cf553bb54cc3bcca671156da59fb918b9b773bc62"} Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.973073 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk" event={"ID":"a11e465b-c6bf-465d-877f-0f09de58c651","Type":"ContainerStarted","Data":"4aab91837cc3d224ca284d16ebf4e7a39a8f8c1e95d2a73ac214dac882d77829"} Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.973900 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7x6xf" event={"ID":"7bde36a3-0865-4d8d-a741-4efc59e3b409","Type":"ContainerStarted","Data":"a6e8ddd61b9e3ed375cdd8e5895026cec3810273dd5b904fdc3ed048aea0ccdf"} Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.975480 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-n4h96" event={"ID":"4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76","Type":"ContainerStarted","Data":"869f236fb18026d838832b1477c52101f7b974587170798782b2e90437359efe"} Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.976396 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch" event={"ID":"209cebdd-7761-42a6-9bf1-089cc06c3dca","Type":"ContainerStarted","Data":"ea28b50587a764088b354a86fa7c17a4227fe9490d614e081e53ac7aeb376396"} Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.979962 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rfts6" podStartSLOduration=22.979945035 podStartE2EDuration="22.979945035s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:53.977438008 +0000 UTC m=+56.394784706" watchObservedRunningTime="2025-12-11 05:15:53.979945035 +0000 UTC m=+56.397291733" Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.982370 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xsldw" event={"ID":"a73e280a-008c-4e72-8844-375de50d4222","Type":"ContainerStarted","Data":"a5f343b63f6571df35add27d1dd9326a9d3e56bb5d6a2806fbdba35ddb114664"} Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.982405 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/downloads-7954f5f757-xsldw" Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.986808 4628 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6vrgl container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.986817 4628 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-whb54 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.986878 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" podUID="ce2358a9-7f38-41e7-ba92-b82e8e98b458" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.986910 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" podUID="4d32e4c2-7e36-4cf7-8369-982164414c7b" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Dec 11 05:15:53 crc kubenswrapper[4628]: I1211 05:15:53.994578 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" podStartSLOduration=23.994562495 podStartE2EDuration="23.994562495s" podCreationTimestamp="2025-12-11 05:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:53.992077839 +0000 UTC m=+56.409424537" watchObservedRunningTime="2025-12-11 05:15:53.994562495 +0000 UTC m=+56.411909193" Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.005980 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jcfs4" podStartSLOduration=10.00596623 podStartE2EDuration="10.00596623s" podCreationTimestamp="2025-12-11 05:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:54.005432675 +0000 UTC m=+56.422779373" watchObservedRunningTime="2025-12-11 05:15:54.00596623 +0000 UTC m=+56.423312928" Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.094069 4628 patch_prober.go:28] interesting pod/downloads-7954f5f757-xsldw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.094453 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xsldw" podUID="a73e280a-008c-4e72-8844-375de50d4222" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.095559 4628 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:54 crc kubenswrapper[4628]: E1211 05:15:54.095861 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:54.595829016 +0000 UTC m=+57.013175714 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.113902 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-xsldw" podStartSLOduration=24.113882627 podStartE2EDuration="24.113882627s" podCreationTimestamp="2025-12-11 05:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:54.113506907 +0000 UTC m=+56.530853605" watchObservedRunningTime="2025-12-11 05:15:54.113882627 +0000 UTC m=+56.531229325" Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.127884 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m8br6" Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.196454 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:54 crc kubenswrapper[4628]: E1211 05:15:54.196671 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:54.696614214 +0000 UTC m=+57.113960902 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.196785 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:54 crc kubenswrapper[4628]: E1211 05:15:54.197976 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:54.69796801 +0000 UTC m=+57.115314708 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.299203 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:54 crc kubenswrapper[4628]: E1211 05:15:54.299716 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:54.799701642 +0000 UTC m=+57.217048340 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.400492 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:54 crc kubenswrapper[4628]: E1211 05:15:54.400975 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:54.900961824 +0000 UTC m=+57.318308522 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.425444 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.455923 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.503031 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:54 crc kubenswrapper[4628]: E1211 05:15:54.503425 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:55.003407735 +0000 UTC m=+57.420754433 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.604960 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:54 crc kubenswrapper[4628]: E1211 05:15:54.605335 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:55.105322474 +0000 UTC m=+57.522669172 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.705876 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:54 crc kubenswrapper[4628]: E1211 05:15:54.706201 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:55.206185063 +0000 UTC m=+57.623531761 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.794745 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:15:54 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:15:54 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:15:54 crc kubenswrapper[4628]: healthz check failed Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.795167 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.806792 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:54 crc kubenswrapper[4628]: E1211 05:15:54.807167 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:55.307154606 +0000 UTC m=+57.724501304 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.908302 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:54 crc kubenswrapper[4628]: E1211 05:15:54.908512 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:55.408481648 +0000 UTC m=+57.825828346 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.908784 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:54 crc kubenswrapper[4628]: E1211 05:15:54.909108 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:55.409100564 +0000 UTC m=+57.826447262 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.986314 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7kdnp" event={"ID":"342f0865-c965-4465-b45c-42b0f84af9e1","Type":"ContainerStarted","Data":"2d1345f35ab95d5d16747e1e0e905aaf4e0ded2a598555efc05a129a2e63aded"} Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.986660 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-7kdnp" Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.987223 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" event={"ID":"12356db6-09ae-438c-a085-6b26ea3b97e8","Type":"ContainerStarted","Data":"87447fe918f3b0379d44cac4ccbb226b4fc0bda7373996dc26eca04373294d43"} Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.988758 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7x6xf" event={"ID":"7bde36a3-0865-4d8d-a741-4efc59e3b409","Type":"ContainerStarted","Data":"80e25e2fbbd876fe32fe4632a75565f8df85197305e22d37ab5f09a336166a45"} Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.988895 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" podUID="1eefadbf-ac92-4b97-999e-fb262b5d45c2" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" gracePeriod=30 Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.989560 4628 patch_prober.go:28] interesting pod/downloads-7954f5f757-xsldw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection 
refused" start-of-body= Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.989657 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xsldw" podUID="a73e280a-008c-4e72-8844-375de50d4222" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.989692 4628 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6vrgl container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Dec 11 05:15:54 crc kubenswrapper[4628]: I1211 05:15:54.989807 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" podUID="ce2358a9-7f38-41e7-ba92-b82e8e98b458" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.010295 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:55 crc kubenswrapper[4628]: E1211 05:15:55.010763 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:55.510747465 +0000 UTC m=+57.928094163 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.015102 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=1.015084771 podStartE2EDuration="1.015084771s" podCreationTimestamp="2025-12-11 05:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:55.012170823 +0000 UTC m=+57.429517531" watchObservedRunningTime="2025-12-11 05:15:55.015084771 +0000 UTC m=+57.432431469" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.063120 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7kdnp" podStartSLOduration=11.063104882 podStartE2EDuration="11.063104882s" podCreationTimestamp="2025-12-11 05:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:55.061127879 +0000 UTC m=+57.478474577" watchObservedRunningTime="2025-12-11 05:15:55.063104882 +0000 UTC m=+57.480451580" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.083402 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch" podStartSLOduration=25.083381862 podStartE2EDuration="25.083381862s" podCreationTimestamp="2025-12-11 05:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:55.080355191 +0000 UTC m=+57.497701889" watchObservedRunningTime="2025-12-11 05:15:55.083381862 +0000 UTC m=+57.500728560" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.112173 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:55 crc kubenswrapper[4628]: E1211 05:15:55.117431 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:55.61741655 +0000 UTC m=+58.034763238 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.136706 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-n4h96" podStartSLOduration=25.136686564 podStartE2EDuration="25.136686564s" podCreationTimestamp="2025-12-11 05:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:55.118095248 +0000 UTC m=+57.535441946" watchObservedRunningTime="2025-12-11 05:15:55.136686564 +0000 UTC m=+57.554033262" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.180724 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mhkq4" podStartSLOduration=24.180709268 podStartE2EDuration="24.180709268s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:55.17104253 +0000 UTC m=+57.588389228" watchObservedRunningTime="2025-12-11 05:15:55.180709268 +0000 UTC m=+57.598055966" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.182898 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk" podStartSLOduration=24.182890916 podStartE2EDuration="24.182890916s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:55.137948958 +0000 UTC m=+57.555295646" watchObservedRunningTime="2025-12-11 05:15:55.182890916 +0000 UTC m=+57.600237614" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.212938 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:55 crc kubenswrapper[4628]: E1211 05:15:55.213207 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:55.713188794 +0000 UTC m=+58.130535492 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.305021 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f954d" podStartSLOduration=24.304990342 podStartE2EDuration="24.304990342s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:55.30340721 +0000 UTC m=+57.720753908" watchObservedRunningTime="2025-12-11 05:15:55.304990342 +0000 UTC m=+57.722337040" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.306335 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-2dgsz" podStartSLOduration=24.306327668 podStartE2EDuration="24.306327668s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:55.214051417 +0000 UTC m=+57.631398115" watchObservedRunningTime="2025-12-11 05:15:55.306327668 +0000 UTC m=+57.723674366" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.313717 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:55 crc kubenswrapper[4628]: E1211 05:15:55.314092 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:55.814080415 +0000 UTC m=+58.231427113 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.364874 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-62v78" podStartSLOduration=24.364858759 podStartE2EDuration="24.364858759s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:55.359423024 +0000 UTC m=+57.776769722" watchObservedRunningTime="2025-12-11 05:15:55.364858759 +0000 UTC m=+57.782205457" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.414581 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:55 crc kubenswrapper[4628]: E1211 05:15:55.414742 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:55.914716568 +0000 UTC m=+58.332063256 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.516149 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:55 crc kubenswrapper[4628]: E1211 05:15:55.516541 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:56.016515803 +0000 UTC m=+58.433862501 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.605123 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7m4g7"] Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.606032 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7m4g7" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.612547 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.616550 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.616636 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d5f88ec-256a-4556-b06b-814dfa23c87b-catalog-content\") pod \"certified-operators-7m4g7\" (UID: \"3d5f88ec-256a-4556-b06b-814dfa23c87b\") " pod="openshift-marketplace/certified-operators-7m4g7" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.616657 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9swxk\" (UniqueName: \"kubernetes.io/projected/3d5f88ec-256a-4556-b06b-814dfa23c87b-kube-api-access-9swxk\") pod \"certified-operators-7m4g7\" (UID: \"3d5f88ec-256a-4556-b06b-814dfa23c87b\") " pod="openshift-marketplace/certified-operators-7m4g7" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.616686 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d5f88ec-256a-4556-b06b-814dfa23c87b-utilities\") pod \"certified-operators-7m4g7\" (UID: \"3d5f88ec-256a-4556-b06b-814dfa23c87b\") " pod="openshift-marketplace/certified-operators-7m4g7" Dec 11 05:15:55 crc kubenswrapper[4628]: E1211 05:15:55.616825 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:56.116812058 +0000 UTC m=+58.534158756 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.632535 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7m4g7"] Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.717707 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d5f88ec-256a-4556-b06b-814dfa23c87b-utilities\") pod \"certified-operators-7m4g7\" (UID: \"3d5f88ec-256a-4556-b06b-814dfa23c87b\") " pod="openshift-marketplace/certified-operators-7m4g7" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.717760 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.717834 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d5f88ec-256a-4556-b06b-814dfa23c87b-catalog-content\") pod \"certified-operators-7m4g7\" (UID: \"3d5f88ec-256a-4556-b06b-814dfa23c87b\") " pod="openshift-marketplace/certified-operators-7m4g7" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.717864 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9swxk\" (UniqueName: \"kubernetes.io/projected/3d5f88ec-256a-4556-b06b-814dfa23c87b-kube-api-access-9swxk\") pod \"certified-operators-7m4g7\" (UID: \"3d5f88ec-256a-4556-b06b-814dfa23c87b\") " pod="openshift-marketplace/certified-operators-7m4g7" Dec 11 05:15:55 crc kubenswrapper[4628]: E1211 05:15:55.718202 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:56.218189552 +0000 UTC m=+58.635536250 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.718475 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d5f88ec-256a-4556-b06b-814dfa23c87b-utilities\") pod \"certified-operators-7m4g7\" (UID: \"3d5f88ec-256a-4556-b06b-814dfa23c87b\") " pod="openshift-marketplace/certified-operators-7m4g7" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.718542 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d5f88ec-256a-4556-b06b-814dfa23c87b-catalog-content\") pod \"certified-operators-7m4g7\" (UID: \"3d5f88ec-256a-4556-b06b-814dfa23c87b\") " pod="openshift-marketplace/certified-operators-7m4g7" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.744599 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9swxk\" (UniqueName: \"kubernetes.io/projected/3d5f88ec-256a-4556-b06b-814dfa23c87b-kube-api-access-9swxk\") pod \"certified-operators-7m4g7\" (UID: \"3d5f88ec-256a-4556-b06b-814dfa23c87b\") " pod="openshift-marketplace/certified-operators-7m4g7" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.795313 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:15:55 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:15:55 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:15:55 crc kubenswrapper[4628]: healthz check failed Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.795724 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.819191 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:55 crc kubenswrapper[4628]: E1211 05:15:55.819353 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:56.319324479 +0000 UTC m=+58.736671177 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.819452 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:55 crc kubenswrapper[4628]: E1211 05:15:55.819940 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:56.319922284 +0000 UTC m=+58.737268983 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.919595 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7m4g7" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.920036 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:55 crc kubenswrapper[4628]: E1211 05:15:55.920194 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:56.420157398 +0000 UTC m=+58.837504096 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.920418 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:55 crc kubenswrapper[4628]: E1211 05:15:55.920710 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:56.420698762 +0000 UTC m=+58.838045460 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.968557 4628 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-wwzr5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.968616 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" podUID="53447f0d-9279-4e5f-a63c-a1b050d24b4b" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.968615 4628 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-wwzr5 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.968708 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" podUID="53447f0d-9279-4e5f-a63c-a1b050d24b4b" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.992827 4628 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-wwzr5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get 
\"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.992907 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" podUID="53447f0d-9279-4e5f-a63c-a1b050d24b4b" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.994895 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dcx8h"] Dec 11 05:15:55 crc kubenswrapper[4628]: I1211 05:15:55.995970 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dcx8h" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.013587 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dcx8h"] Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.022193 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.022375 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.022433 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.022459 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.022494 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:56 crc kubenswrapper[4628]: E1211 05:15:56.030842 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-11 05:15:56.530814238 +0000 UTC m=+58.948160936 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.031728 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.043506 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.043676 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.126133 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a27ccd53-7ae0-4c98-9461-b545f841ea79-utilities\") pod \"certified-operators-dcx8h\" (UID: \"a27ccd53-7ae0-4c98-9461-b545f841ea79\") " pod="openshift-marketplace/certified-operators-dcx8h" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.126213 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.126293 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a27ccd53-7ae0-4c98-9461-b545f841ea79-catalog-content\") pod \"certified-operators-dcx8h\" (UID: \"a27ccd53-7ae0-4c98-9461-b545f841ea79\") " pod="openshift-marketplace/certified-operators-dcx8h" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.126319 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m597z\" (UniqueName: \"kubernetes.io/projected/a27ccd53-7ae0-4c98-9461-b545f841ea79-kube-api-access-m597z\") pod \"certified-operators-dcx8h\" (UID: \"a27ccd53-7ae0-4c98-9461-b545f841ea79\") " 
pod="openshift-marketplace/certified-operators-dcx8h" Dec 11 05:15:56 crc kubenswrapper[4628]: E1211 05:15:56.127259 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:56.627241571 +0000 UTC m=+59.044588269 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.127767 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.134053 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.153066 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7m4g7"] Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.226796 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:56 crc kubenswrapper[4628]: E1211 05:15:56.226958 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:56.726933679 +0000 UTC m=+59.144280377 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.227056 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a27ccd53-7ae0-4c98-9461-b545f841ea79-utilities\") pod \"certified-operators-dcx8h\" (UID: \"a27ccd53-7ae0-4c98-9461-b545f841ea79\") " pod="openshift-marketplace/certified-operators-dcx8h" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.227120 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.227160 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a27ccd53-7ae0-4c98-9461-b545f841ea79-catalog-content\") pod \"certified-operators-dcx8h\" (UID: \"a27ccd53-7ae0-4c98-9461-b545f841ea79\") " pod="openshift-marketplace/certified-operators-dcx8h" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.227182 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m597z\" (UniqueName: \"kubernetes.io/projected/a27ccd53-7ae0-4c98-9461-b545f841ea79-kube-api-access-m597z\") pod \"certified-operators-dcx8h\" (UID: \"a27ccd53-7ae0-4c98-9461-b545f841ea79\") " pod="openshift-marketplace/certified-operators-dcx8h" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.227431 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a27ccd53-7ae0-4c98-9461-b545f841ea79-utilities\") pod \"certified-operators-dcx8h\" (UID: \"a27ccd53-7ae0-4c98-9461-b545f841ea79\") " pod="openshift-marketplace/certified-operators-dcx8h" Dec 11 05:15:56 crc kubenswrapper[4628]: E1211 05:15:56.227469 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:56.727452733 +0000 UTC m=+59.144799431 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.227679 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a27ccd53-7ae0-4c98-9461-b545f841ea79-catalog-content\") pod \"certified-operators-dcx8h\" (UID: \"a27ccd53-7ae0-4c98-9461-b545f841ea79\") " pod="openshift-marketplace/certified-operators-dcx8h" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.272823 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.273949 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.329367 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:56 crc kubenswrapper[4628]: E1211 05:15:56.330471 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:56.830439619 +0000 UTC m=+59.247786317 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.381714 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.381781 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.397028 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.408123 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hjbgr"] Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.409035 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hjbgr" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.414588 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.416051 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hjbgr"] Dec 11 05:15:56 crc kubenswrapper[4628]: W1211 05:15:56.419006 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-8494243dc85eba647532f204b97aa51cdaa79c5f3887ca1a8ad4cadb22f54290 WatchSource:0}: Error finding container 8494243dc85eba647532f204b97aa51cdaa79c5f3887ca1a8ad4cadb22f54290: Status 404 returned error can't find the container with id 8494243dc85eba647532f204b97aa51cdaa79c5f3887ca1a8ad4cadb22f54290 Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.435092 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:56 crc kubenswrapper[4628]: E1211 05:15:56.436677 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:56.936664192 +0000 UTC m=+59.354010890 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.536167 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:56 crc kubenswrapper[4628]: E1211 05:15:56.536480 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:57.036450743 +0000 UTC m=+59.453797441 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.536949 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/445e77bd-611f-486b-af50-16e4476e29e4-utilities\") pod \"community-operators-hjbgr\" (UID: \"445e77bd-611f-486b-af50-16e4476e29e4\") " pod="openshift-marketplace/community-operators-hjbgr" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.537061 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf68p\" (UniqueName: \"kubernetes.io/projected/445e77bd-611f-486b-af50-16e4476e29e4-kube-api-access-cf68p\") pod \"community-operators-hjbgr\" (UID: \"445e77bd-611f-486b-af50-16e4476e29e4\") " pod="openshift-marketplace/community-operators-hjbgr" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.537153 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.537297 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/445e77bd-611f-486b-af50-16e4476e29e4-catalog-content\") pod \"community-operators-hjbgr\" (UID: \"445e77bd-611f-486b-af50-16e4476e29e4\") " pod="openshift-marketplace/community-operators-hjbgr" Dec 11 05:15:56 crc kubenswrapper[4628]: E1211 05:15:56.538786 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:57.038777626 +0000 UTC m=+59.456124314 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.594926 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k7hq5"] Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.596426 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k7hq5" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.617934 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k7hq5"] Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.638637 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.638929 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/445e77bd-611f-486b-af50-16e4476e29e4-catalog-content\") pod \"community-operators-hjbgr\" (UID: \"445e77bd-611f-486b-af50-16e4476e29e4\") " pod="openshift-marketplace/community-operators-hjbgr" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.639030 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/445e77bd-611f-486b-af50-16e4476e29e4-utilities\") pod \"community-operators-hjbgr\" (UID: \"445e77bd-611f-486b-af50-16e4476e29e4\") " pod="openshift-marketplace/community-operators-hjbgr" Dec 11 05:15:56 crc kubenswrapper[4628]: E1211 05:15:56.639815 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:57.139783669 +0000 UTC m=+59.557130367 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.640438 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/445e77bd-611f-486b-af50-16e4476e29e4-catalog-content\") pod \"community-operators-hjbgr\" (UID: \"445e77bd-611f-486b-af50-16e4476e29e4\") " pod="openshift-marketplace/community-operators-hjbgr" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.640580 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/445e77bd-611f-486b-af50-16e4476e29e4-utilities\") pod \"community-operators-hjbgr\" (UID: \"445e77bd-611f-486b-af50-16e4476e29e4\") " pod="openshift-marketplace/community-operators-hjbgr" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.640300 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf68p\" (UniqueName: \"kubernetes.io/projected/445e77bd-611f-486b-af50-16e4476e29e4-kube-api-access-cf68p\") pod \"community-operators-hjbgr\" (UID: \"445e77bd-611f-486b-af50-16e4476e29e4\") " pod="openshift-marketplace/community-operators-hjbgr" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.641246 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:56 crc kubenswrapper[4628]: E1211 05:15:56.641759 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:57.141740841 +0000 UTC m=+59.559087539 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.656088 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf68p\" (UniqueName: \"kubernetes.io/projected/445e77bd-611f-486b-af50-16e4476e29e4-kube-api-access-cf68p\") pod \"community-operators-hjbgr\" (UID: \"445e77bd-611f-486b-af50-16e4476e29e4\") " pod="openshift-marketplace/community-operators-hjbgr" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.726193 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hjbgr" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.743556 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.743976 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b719f066-3997-4c70-bfb1-b489c56e2ef4-utilities\") pod \"community-operators-k7hq5\" (UID: \"b719f066-3997-4c70-bfb1-b489c56e2ef4\") " pod="openshift-marketplace/community-operators-k7hq5" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.744072 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hqpf\" (UniqueName: \"kubernetes.io/projected/b719f066-3997-4c70-bfb1-b489c56e2ef4-kube-api-access-8hqpf\") pod \"community-operators-k7hq5\" (UID: \"b719f066-3997-4c70-bfb1-b489c56e2ef4\") " pod="openshift-marketplace/community-operators-k7hq5" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.744235 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b719f066-3997-4c70-bfb1-b489c56e2ef4-catalog-content\") pod \"community-operators-k7hq5\" (UID: \"b719f066-3997-4c70-bfb1-b489c56e2ef4\") " pod="openshift-marketplace/community-operators-k7hq5" Dec 11 05:15:56 crc kubenswrapper[4628]: E1211 05:15:56.744419 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:57.244396589 +0000 UTC m=+59.661743287 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.795129 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:15:56 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:15:56 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:15:56 crc kubenswrapper[4628]: healthz check failed Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.795652 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.845941 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hqpf\" (UniqueName: \"kubernetes.io/projected/b719f066-3997-4c70-bfb1-b489c56e2ef4-kube-api-access-8hqpf\") pod \"community-operators-k7hq5\" (UID: \"b719f066-3997-4c70-bfb1-b489c56e2ef4\") " pod="openshift-marketplace/community-operators-k7hq5" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.846394 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.846429 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b719f066-3997-4c70-bfb1-b489c56e2ef4-catalog-content\") pod \"community-operators-k7hq5\" (UID: \"b719f066-3997-4c70-bfb1-b489c56e2ef4\") " pod="openshift-marketplace/community-operators-k7hq5" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.846451 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b719f066-3997-4c70-bfb1-b489c56e2ef4-utilities\") pod \"community-operators-k7hq5\" (UID: \"b719f066-3997-4c70-bfb1-b489c56e2ef4\") " pod="openshift-marketplace/community-operators-k7hq5" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.846865 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b719f066-3997-4c70-bfb1-b489c56e2ef4-utilities\") pod \"community-operators-k7hq5\" (UID: \"b719f066-3997-4c70-bfb1-b489c56e2ef4\") " pod="openshift-marketplace/community-operators-k7hq5" Dec 11 05:15:56 crc kubenswrapper[4628]: E1211 05:15:56.847117 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-11 05:15:57.347106068 +0000 UTC m=+59.764452766 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.847486 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b719f066-3997-4c70-bfb1-b489c56e2ef4-catalog-content\") pod \"community-operators-k7hq5\" (UID: \"b719f066-3997-4c70-bfb1-b489c56e2ef4\") " pod="openshift-marketplace/community-operators-k7hq5" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.884739 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hqpf\" (UniqueName: \"kubernetes.io/projected/b719f066-3997-4c70-bfb1-b489c56e2ef4-kube-api-access-8hqpf\") pod \"community-operators-k7hq5\" (UID: \"b719f066-3997-4c70-bfb1-b489c56e2ef4\") " pod="openshift-marketplace/community-operators-k7hq5" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.909046 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k7hq5" Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.927634 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hjbgr"] Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.949123 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:56 crc kubenswrapper[4628]: E1211 05:15:56.949510 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:57.449477718 +0000 UTC m=+59.866824426 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.949753 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:56 crc kubenswrapper[4628]: E1211 05:15:56.950272 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:57.450254239 +0000 UTC m=+59.867600937 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:56 crc kubenswrapper[4628]: W1211 05:15:56.970003 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod445e77bd_611f_486b_af50_16e4476e29e4.slice/crio-3615ecc86be0049d3d99e018ac06758193dedb0ce7d1246c4f8d167a546aa1ba WatchSource:0}: Error finding container 3615ecc86be0049d3d99e018ac06758193dedb0ce7d1246c4f8d167a546aa1ba: Status 404 returned error can't find the container with id 3615ecc86be0049d3d99e018ac06758193dedb0ce7d1246c4f8d167a546aa1ba Dec 11 05:15:56 crc kubenswrapper[4628]: I1211 05:15:56.998541 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e700c694bd090cc40a0f9619b54ad409f131e05e47910bd16f7eec28c15adcb5"} Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.000116 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjbgr" event={"ID":"445e77bd-611f-486b-af50-16e4476e29e4","Type":"ContainerStarted","Data":"3615ecc86be0049d3d99e018ac06758193dedb0ce7d1246c4f8d167a546aa1ba"} Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.000987 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8494243dc85eba647532f204b97aa51cdaa79c5f3887ca1a8ad4cadb22f54290"} Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.002880 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7m4g7" 
event={"ID":"3d5f88ec-256a-4556-b06b-814dfa23c87b","Type":"ContainerStarted","Data":"571364e39ca5d07d374fde8e1cef9a3d8f1cf944268631788e85fdc9c34f072b"} Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.009148 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42zdb" Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.052348 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:57 crc kubenswrapper[4628]: E1211 05:15:57.052639 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:57.552624479 +0000 UTC m=+59.969971167 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.238194 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k7hq5"] Dec 11 05:15:57 crc kubenswrapper[4628]: W1211 05:15:57.243586 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb719f066_3997_4c70_bfb1_b489c56e2ef4.slice/crio-87beb35fd22927e3d7f10bdba4d317a63288b0660668dc61d5ce635a9c5adbfe WatchSource:0}: Error finding container 87beb35fd22927e3d7f10bdba4d317a63288b0660668dc61d5ce635a9c5adbfe: Status 404 returned error can't find the container with id 87beb35fd22927e3d7f10bdba4d317a63288b0660668dc61d5ce635a9c5adbfe Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.285561 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.285794 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.286982 4628 patch_prober.go:28] interesting pod/console-f9d7485db-4nw5h container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.287026 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-4nw5h" podUID="5111b417-34a8-405f-a0b8-eab04e144ff8" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.295009 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.295575 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m597z\" (UniqueName: \"kubernetes.io/projected/a27ccd53-7ae0-4c98-9461-b545f841ea79-kube-api-access-m597z\") pod \"certified-operators-dcx8h\" (UID: \"a27ccd53-7ae0-4c98-9461-b545f841ea79\") " pod="openshift-marketplace/certified-operators-dcx8h" Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.295766 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:57 crc kubenswrapper[4628]: E1211 05:15:57.297753 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:57.797737266 +0000 UTC m=+60.215083964 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.351376 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.404550 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:57 crc kubenswrapper[4628]: E1211 05:15:57.405786 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:57.905769126 +0000 UTC m=+60.323115824 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:57 crc kubenswrapper[4628]: E1211 05:15:57.507368 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:58.007268393 +0000 UTC m=+60.424615091 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.506189 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.509151 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dcx8h" Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.608956 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:57 crc kubenswrapper[4628]: E1211 05:15:57.609646 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:58.109631093 +0000 UTC m=+60.526977791 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.711208 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:57 crc kubenswrapper[4628]: E1211 05:15:57.711755 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:58.211733766 +0000 UTC m=+60.629080464 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.791216 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.801639 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:15:57 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:15:57 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:15:57 crc kubenswrapper[4628]: healthz check failed Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.801695 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.813068 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:57 crc kubenswrapper[4628]: E1211 05:15:57.813470 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:58.313448759 +0000 UTC m=+60.730795457 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.844778 4628 patch_prober.go:28] interesting pod/downloads-7954f5f757-xsldw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.845216 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-xsldw" podUID="a73e280a-008c-4e72-8844-375de50d4222" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.862604 4628 patch_prober.go:28] interesting pod/downloads-7954f5f757-xsldw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.862666 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xsldw" podUID="a73e280a-008c-4e72-8844-375de50d4222" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.880479 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" Dec 11 05:15:57 crc kubenswrapper[4628]: E1211 05:15:57.900986 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 11 05:15:57 crc kubenswrapper[4628]: I1211 05:15:57.918127 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:57 crc kubenswrapper[4628]: E1211 05:15:57.920596 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:58.420584186 +0000 UTC m=+60.837930884 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:57 crc kubenswrapper[4628]: E1211 05:15:57.926696 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 11 05:15:57 crc kubenswrapper[4628]: E1211 05:15:57.980901 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 11 05:15:57 crc kubenswrapper[4628]: E1211 05:15:57.987640 4628 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" podUID="1eefadbf-ac92-4b97-999e-fb262b5d45c2" containerName="kube-multus-additional-cni-plugins" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.022452 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:58 crc kubenswrapper[4628]: E1211 05:15:58.022761 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:58.52274485 +0000 UTC m=+60.940091548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.079086 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"83faab7d0f940e85cfca0d5559dbae7b1acb97912c9018f21585bea9cae2944c"} Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.079718 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.123538 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.124597 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:58 crc kubenswrapper[4628]: E1211 05:15:58.125437 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:58.625423599 +0000 UTC m=+61.042770297 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.127480 4628 generic.go:334] "Generic (PLEG): container finished" podID="3d5f88ec-256a-4556-b06b-814dfa23c87b" containerID="bf54f156f958230768e0d40884ea3ba73cdb572e1bc60abe93af6ca18feec130" exitCode=0 Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.127588 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7m4g7" event={"ID":"3d5f88ec-256a-4556-b06b-814dfa23c87b","Type":"ContainerDied","Data":"bf54f156f958230768e0d40884ea3ba73cdb572e1bc60abe93af6ca18feec130"} Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.167154 4628 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.188476 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f954d" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.206158 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"360f9be42c3ea507a823924b5caf00e54205466d6f91d401ac59167a34be3bae"} Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.232420 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:58 crc kubenswrapper[4628]: E1211 05:15:58.233666 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:58.733651345 +0000 UTC m=+61.150998043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.238897 4628 generic.go:334] "Generic (PLEG): container finished" podID="445e77bd-611f-486b-af50-16e4476e29e4" containerID="2ea49cc63a78661f88faba3e5633077de8433dd05e4991bb1f381eefebdb8ff5" exitCode=0 Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.243010 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjbgr" event={"ID":"445e77bd-611f-486b-af50-16e4476e29e4","Type":"ContainerDied","Data":"2ea49cc63a78661f88faba3e5633077de8433dd05e4991bb1f381eefebdb8ff5"} Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.269344 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lbkqk" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.292970 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-whb54" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.298989 4628 generic.go:334] "Generic (PLEG): container finished" podID="b719f066-3997-4c70-bfb1-b489c56e2ef4" containerID="506713efa9deab099dc4bb44882b34dc255fd7432a4cc46b0ca2084e2c8c8c15" exitCode=0 Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.300131 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7hq5" event={"ID":"b719f066-3997-4c70-bfb1-b489c56e2ef4","Type":"ContainerDied","Data":"506713efa9deab099dc4bb44882b34dc255fd7432a4cc46b0ca2084e2c8c8c15"} Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.300156 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7hq5" event={"ID":"b719f066-3997-4c70-bfb1-b489c56e2ef4","Type":"ContainerStarted","Data":"87beb35fd22927e3d7f10bdba4d317a63288b0660668dc61d5ce635a9c5adbfe"} Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.336085 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:58 crc kubenswrapper[4628]: E1211 05:15:58.339826 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:58.839811166 +0000 UTC m=+61.257157864 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.436881 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:58 crc kubenswrapper[4628]: E1211 05:15:58.437130 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:58.93708633 +0000 UTC m=+61.354433028 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.437178 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rbrk4"] Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.437396 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:58 crc kubenswrapper[4628]: E1211 05:15:58.437904 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:58.937886192 +0000 UTC m=+61.355232890 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.493506 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbrk4" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.497113 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.539172 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:58 crc kubenswrapper[4628]: E1211 05:15:58.539661 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:59.039642315 +0000 UTC m=+61.456989013 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.549594 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbrk4"] Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.627294 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.629247 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.633671 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.641156 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.641491 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeedd8fe-1e9a-4009-a385-07d72fed1277-catalog-content\") pod \"redhat-marketplace-rbrk4\" (UID: \"eeedd8fe-1e9a-4009-a385-07d72fed1277\") " pod="openshift-marketplace/redhat-marketplace-rbrk4" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.641540 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgnh4\" (UniqueName: \"kubernetes.io/projected/eeedd8fe-1e9a-4009-a385-07d72fed1277-kube-api-access-cgnh4\") pod \"redhat-marketplace-rbrk4\" (UID: \"eeedd8fe-1e9a-4009-a385-07d72fed1277\") " pod="openshift-marketplace/redhat-marketplace-rbrk4" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.641588 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.641629 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeedd8fe-1e9a-4009-a385-07d72fed1277-utilities\") pod \"redhat-marketplace-rbrk4\" (UID: \"eeedd8fe-1e9a-4009-a385-07d72fed1277\") " pod="openshift-marketplace/redhat-marketplace-rbrk4" Dec 11 05:15:58 crc kubenswrapper[4628]: E1211 05:15:58.642413 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:59.142389356 +0000 UTC m=+61.559736054 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.654047 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.670233 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-7x6xf" podStartSLOduration=27.670206748 podStartE2EDuration="27.670206748s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:15:58.553236248 +0000 UTC m=+60.970582936" watchObservedRunningTime="2025-12-11 05:15:58.670206748 +0000 UTC m=+61.087553446" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.736903 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dcx8h"] Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.744365 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.744629 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ece60403-400b-4da3-ab9c-b2030b94e0bf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ece60403-400b-4da3-ab9c-b2030b94e0bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.744664 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeedd8fe-1e9a-4009-a385-07d72fed1277-catalog-content\") pod \"redhat-marketplace-rbrk4\" (UID: \"eeedd8fe-1e9a-4009-a385-07d72fed1277\") " pod="openshift-marketplace/redhat-marketplace-rbrk4" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.744686 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgnh4\" (UniqueName: \"kubernetes.io/projected/eeedd8fe-1e9a-4009-a385-07d72fed1277-kube-api-access-cgnh4\") pod \"redhat-marketplace-rbrk4\" (UID: \"eeedd8fe-1e9a-4009-a385-07d72fed1277\") " pod="openshift-marketplace/redhat-marketplace-rbrk4" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.744720 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ece60403-400b-4da3-ab9c-b2030b94e0bf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ece60403-400b-4da3-ab9c-b2030b94e0bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.744751 4628 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeedd8fe-1e9a-4009-a385-07d72fed1277-utilities\") pod \"redhat-marketplace-rbrk4\" (UID: \"eeedd8fe-1e9a-4009-a385-07d72fed1277\") " pod="openshift-marketplace/redhat-marketplace-rbrk4" Dec 11 05:15:58 crc kubenswrapper[4628]: E1211 05:15:58.749569 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:59.249524743 +0000 UTC m=+61.666871441 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.750552 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeedd8fe-1e9a-4009-a385-07d72fed1277-catalog-content\") pod \"redhat-marketplace-rbrk4\" (UID: \"eeedd8fe-1e9a-4009-a385-07d72fed1277\") " pod="openshift-marketplace/redhat-marketplace-rbrk4" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.772201 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeedd8fe-1e9a-4009-a385-07d72fed1277-utilities\") pod \"redhat-marketplace-rbrk4\" (UID: \"eeedd8fe-1e9a-4009-a385-07d72fed1277\") " pod="openshift-marketplace/redhat-marketplace-rbrk4" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.811056 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:15:58 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:15:58 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:15:58 crc kubenswrapper[4628]: healthz check failed Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.811111 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.820969 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgnh4\" (UniqueName: \"kubernetes.io/projected/eeedd8fe-1e9a-4009-a385-07d72fed1277-kube-api-access-cgnh4\") pod \"redhat-marketplace-rbrk4\" (UID: \"eeedd8fe-1e9a-4009-a385-07d72fed1277\") " pod="openshift-marketplace/redhat-marketplace-rbrk4" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.848577 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ece60403-400b-4da3-ab9c-b2030b94e0bf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ece60403-400b-4da3-ab9c-b2030b94e0bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 11 05:15:58 crc 
kubenswrapper[4628]: I1211 05:15:58.848644 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.848663 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ece60403-400b-4da3-ab9c-b2030b94e0bf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ece60403-400b-4da3-ab9c-b2030b94e0bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.848744 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ece60403-400b-4da3-ab9c-b2030b94e0bf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ece60403-400b-4da3-ab9c-b2030b94e0bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 11 05:15:58 crc kubenswrapper[4628]: E1211 05:15:58.849223 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:59.349212261 +0000 UTC m=+61.766558959 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.850282 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbrk4" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.861318 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nrc6p"] Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.862429 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nrc6p" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.918350 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ece60403-400b-4da3-ab9c-b2030b94e0bf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ece60403-400b-4da3-ab9c-b2030b94e0bf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.949613 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:58 crc kubenswrapper[4628]: E1211 05:15:58.949814 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:59.449782943 +0000 UTC m=+61.867129641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.949909 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:58 crc kubenswrapper[4628]: E1211 05:15:58.950259 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:59.450251886 +0000 UTC m=+61.867598584 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.962142 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nrc6p"] Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.969231 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 11 05:15:58 crc kubenswrapper[4628]: I1211 05:15:58.990591 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wwzr5" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.050813 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:59 crc kubenswrapper[4628]: E1211 05:15:59.050981 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:59.550955271 +0000 UTC m=+61.968301969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.051037 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.051086 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3953bb62-252a-4109-9ed9-f14294565e1d-catalog-content\") pod \"redhat-marketplace-nrc6p\" (UID: \"3953bb62-252a-4109-9ed9-f14294565e1d\") " pod="openshift-marketplace/redhat-marketplace-nrc6p" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.051159 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh82m\" (UniqueName: \"kubernetes.io/projected/3953bb62-252a-4109-9ed9-f14294565e1d-kube-api-access-bh82m\") pod \"redhat-marketplace-nrc6p\" (UID: \"3953bb62-252a-4109-9ed9-f14294565e1d\") " pod="openshift-marketplace/redhat-marketplace-nrc6p" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.051228 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3953bb62-252a-4109-9ed9-f14294565e1d-utilities\") pod \"redhat-marketplace-nrc6p\" (UID: \"3953bb62-252a-4109-9ed9-f14294565e1d\") " pod="openshift-marketplace/redhat-marketplace-nrc6p" Dec 11 05:15:59 crc kubenswrapper[4628]: E1211 05:15:59.051339 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-11 05:15:59.551331452 +0000 UTC m=+61.968678150 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:59 crc kubenswrapper[4628]: W1211 05:15:59.114825 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda27ccd53_7ae0_4c98_9461_b545f841ea79.slice/crio-822281fc7c659db96eff7124c81359484a44641b3f919d676ff743dee7430497 WatchSource:0}: Error finding container 822281fc7c659db96eff7124c81359484a44641b3f919d676ff743dee7430497: Status 404 returned error can't find the container with id 822281fc7c659db96eff7124c81359484a44641b3f919d676ff743dee7430497 Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.154316 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.154572 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3953bb62-252a-4109-9ed9-f14294565e1d-catalog-content\") pod \"redhat-marketplace-nrc6p\" (UID: \"3953bb62-252a-4109-9ed9-f14294565e1d\") " pod="openshift-marketplace/redhat-marketplace-nrc6p" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.154614 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh82m\" (UniqueName: \"kubernetes.io/projected/3953bb62-252a-4109-9ed9-f14294565e1d-kube-api-access-bh82m\") pod \"redhat-marketplace-nrc6p\" (UID: \"3953bb62-252a-4109-9ed9-f14294565e1d\") " pod="openshift-marketplace/redhat-marketplace-nrc6p" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.154646 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3953bb62-252a-4109-9ed9-f14294565e1d-utilities\") pod \"redhat-marketplace-nrc6p\" (UID: \"3953bb62-252a-4109-9ed9-f14294565e1d\") " pod="openshift-marketplace/redhat-marketplace-nrc6p" Dec 11 05:15:59 crc kubenswrapper[4628]: E1211 05:15:59.155458 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:59.655443678 +0000 UTC m=+62.072790376 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.156055 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3953bb62-252a-4109-9ed9-f14294565e1d-utilities\") pod \"redhat-marketplace-nrc6p\" (UID: \"3953bb62-252a-4109-9ed9-f14294565e1d\") " pod="openshift-marketplace/redhat-marketplace-nrc6p" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.156147 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3953bb62-252a-4109-9ed9-f14294565e1d-catalog-content\") pod \"redhat-marketplace-nrc6p\" (UID: \"3953bb62-252a-4109-9ed9-f14294565e1d\") " pod="openshift-marketplace/redhat-marketplace-nrc6p" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.231020 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh82m\" (UniqueName: \"kubernetes.io/projected/3953bb62-252a-4109-9ed9-f14294565e1d-kube-api-access-bh82m\") pod \"redhat-marketplace-nrc6p\" (UID: \"3953bb62-252a-4109-9ed9-f14294565e1d\") " pod="openshift-marketplace/redhat-marketplace-nrc6p" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.246838 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mjcbm"] Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.247966 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mjcbm" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.257023 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:59 crc kubenswrapper[4628]: E1211 05:15:59.257490 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:59.757477639 +0000 UTC m=+62.174824337 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.270657 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.319975 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mjcbm"] Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.342445 4628 patch_prober.go:28] interesting pod/apiserver-76f77b778f-n4h96 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 11 05:15:59 crc kubenswrapper[4628]: [+]log ok Dec 11 05:15:59 crc kubenswrapper[4628]: [+]etcd ok Dec 11 05:15:59 crc kubenswrapper[4628]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 11 05:15:59 crc kubenswrapper[4628]: [+]poststarthook/generic-apiserver-start-informers ok Dec 11 05:15:59 crc kubenswrapper[4628]: [+]poststarthook/max-in-flight-filter ok Dec 11 05:15:59 crc kubenswrapper[4628]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 11 05:15:59 crc kubenswrapper[4628]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 11 05:15:59 crc kubenswrapper[4628]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 11 05:15:59 crc kubenswrapper[4628]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 11 05:15:59 crc kubenswrapper[4628]: [+]poststarthook/project.openshift.io-projectcache ok Dec 11 05:15:59 crc kubenswrapper[4628]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 11 05:15:59 crc kubenswrapper[4628]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Dec 11 05:15:59 crc kubenswrapper[4628]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 11 05:15:59 crc kubenswrapper[4628]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 11 05:15:59 crc kubenswrapper[4628]: livez check failed Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.342505 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-n4h96" podUID="4d1bb3a2-2abe-44ed-98a3-bf740b1a9f76" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.363627 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:59 crc kubenswrapper[4628]: E1211 05:15:59.368316 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-11 05:15:59.868294904 +0000 UTC m=+62.285641602 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.375195 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2h92\" (UniqueName: \"kubernetes.io/projected/2a9eb6ef-92ff-415b-a526-26711b88985f-kube-api-access-r2h92\") pod \"redhat-operators-mjcbm\" (UID: \"2a9eb6ef-92ff-415b-a526-26711b88985f\") " pod="openshift-marketplace/redhat-operators-mjcbm" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.375971 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a9eb6ef-92ff-415b-a526-26711b88985f-utilities\") pod \"redhat-operators-mjcbm\" (UID: \"2a9eb6ef-92ff-415b-a526-26711b88985f\") " pod="openshift-marketplace/redhat-operators-mjcbm" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.376277 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.376379 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a9eb6ef-92ff-415b-a526-26711b88985f-catalog-content\") pod \"redhat-operators-mjcbm\" (UID: \"2a9eb6ef-92ff-415b-a526-26711b88985f\") " pod="openshift-marketplace/redhat-operators-mjcbm" Dec 11 05:15:59 crc kubenswrapper[4628]: E1211 05:15:59.376881 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:15:59.876832112 +0000 UTC m=+62.294178810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.380699 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcx8h" event={"ID":"a27ccd53-7ae0-4c98-9461-b545f841ea79","Type":"ContainerStarted","Data":"822281fc7c659db96eff7124c81359484a44641b3f919d676ff743dee7430497"} Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.404637 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b196fdc30426e9c5bbf81fba6592ea7a67a2d23d9f6acd11b52121df29d74d8c"} Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.477306 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.477490 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a9eb6ef-92ff-415b-a526-26711b88985f-catalog-content\") pod \"redhat-operators-mjcbm\" (UID: \"2a9eb6ef-92ff-415b-a526-26711b88985f\") " pod="openshift-marketplace/redhat-operators-mjcbm" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.477534 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2h92\" (UniqueName: \"kubernetes.io/projected/2a9eb6ef-92ff-415b-a526-26711b88985f-kube-api-access-r2h92\") pod \"redhat-operators-mjcbm\" (UID: \"2a9eb6ef-92ff-415b-a526-26711b88985f\") " pod="openshift-marketplace/redhat-operators-mjcbm" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.477608 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a9eb6ef-92ff-415b-a526-26711b88985f-utilities\") pod \"redhat-operators-mjcbm\" (UID: \"2a9eb6ef-92ff-415b-a526-26711b88985f\") " pod="openshift-marketplace/redhat-operators-mjcbm" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.478030 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a9eb6ef-92ff-415b-a526-26711b88985f-utilities\") pod \"redhat-operators-mjcbm\" (UID: \"2a9eb6ef-92ff-415b-a526-26711b88985f\") " pod="openshift-marketplace/redhat-operators-mjcbm" Dec 11 05:15:59 crc kubenswrapper[4628]: E1211 05:15:59.478120 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:15:59.978104263 +0000 UTC m=+62.395450961 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.479331 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a9eb6ef-92ff-415b-a526-26711b88985f-catalog-content\") pod \"redhat-operators-mjcbm\" (UID: \"2a9eb6ef-92ff-415b-a526-26711b88985f\") " pod="openshift-marketplace/redhat-operators-mjcbm" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.495178 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nrc6p" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.496282 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2h92\" (UniqueName: \"kubernetes.io/projected/2a9eb6ef-92ff-415b-a526-26711b88985f-kube-api-access-r2h92\") pod \"redhat-operators-mjcbm\" (UID: \"2a9eb6ef-92ff-415b-a526-26711b88985f\") " pod="openshift-marketplace/redhat-operators-mjcbm" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.566908 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mjcbm" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.578462 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:59 crc kubenswrapper[4628]: E1211 05:15:59.578787 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:16:00.078774177 +0000 UTC m=+62.496120875 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.589577 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hc475"] Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.626050 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hc475"] Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.626150 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hc475" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.682773 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:59 crc kubenswrapper[4628]: E1211 05:15:59.682801 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:16:00.182776491 +0000 UTC m=+62.600123189 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.683031 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa15b6c3-774f-4d31-8b55-008c3786d329-catalog-content\") pod \"redhat-operators-hc475\" (UID: \"fa15b6c3-774f-4d31-8b55-008c3786d329\") " pod="openshift-marketplace/redhat-operators-hc475" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.683077 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa15b6c3-774f-4d31-8b55-008c3786d329-utilities\") pod \"redhat-operators-hc475\" (UID: \"fa15b6c3-774f-4d31-8b55-008c3786d329\") " pod="openshift-marketplace/redhat-operators-hc475" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.683101 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h266r\" (UniqueName: \"kubernetes.io/projected/fa15b6c3-774f-4d31-8b55-008c3786d329-kube-api-access-h266r\") pod \"redhat-operators-hc475\" (UID: \"fa15b6c3-774f-4d31-8b55-008c3786d329\") " pod="openshift-marketplace/redhat-operators-hc475" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.683154 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:59 crc kubenswrapper[4628]: E1211 05:15:59.683603 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:16:00.183585752 +0000 UTC m=+62.600932450 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.786402 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.786883 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa15b6c3-774f-4d31-8b55-008c3786d329-catalog-content\") pod \"redhat-operators-hc475\" (UID: \"fa15b6c3-774f-4d31-8b55-008c3786d329\") " pod="openshift-marketplace/redhat-operators-hc475" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.786909 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa15b6c3-774f-4d31-8b55-008c3786d329-utilities\") pod \"redhat-operators-hc475\" (UID: \"fa15b6c3-774f-4d31-8b55-008c3786d329\") " pod="openshift-marketplace/redhat-operators-hc475" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.786929 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h266r\" (UniqueName: \"kubernetes.io/projected/fa15b6c3-774f-4d31-8b55-008c3786d329-kube-api-access-h266r\") pod \"redhat-operators-hc475\" (UID: \"fa15b6c3-774f-4d31-8b55-008c3786d329\") " pod="openshift-marketplace/redhat-operators-hc475" Dec 11 05:15:59 crc kubenswrapper[4628]: E1211 05:15:59.787271 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:16:00.287255977 +0000 UTC m=+62.704602675 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.787583 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa15b6c3-774f-4d31-8b55-008c3786d329-catalog-content\") pod \"redhat-operators-hc475\" (UID: \"fa15b6c3-774f-4d31-8b55-008c3786d329\") " pod="openshift-marketplace/redhat-operators-hc475" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.787854 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa15b6c3-774f-4d31-8b55-008c3786d329-utilities\") pod \"redhat-operators-hc475\" (UID: \"fa15b6c3-774f-4d31-8b55-008c3786d329\") " pod="openshift-marketplace/redhat-operators-hc475" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.801728 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:15:59 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:15:59 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:15:59 crc kubenswrapper[4628]: healthz check failed Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.801770 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.855276 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h266r\" (UniqueName: \"kubernetes.io/projected/fa15b6c3-774f-4d31-8b55-008c3786d329-kube-api-access-h266r\") pod \"redhat-operators-hc475\" (UID: \"fa15b6c3-774f-4d31-8b55-008c3786d329\") " pod="openshift-marketplace/redhat-operators-hc475" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.879470 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.888530 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:15:59 crc kubenswrapper[4628]: E1211 05:15:59.891220 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:16:00.391205629 +0000 UTC m=+62.808552327 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:15:59 crc kubenswrapper[4628]: W1211 05:15:59.937760 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podece60403_400b_4da3_ab9c_b2030b94e0bf.slice/crio-d37d138f9ef43c90aa5ff916f920fe6ed7ecb4c2cf4cfaa131fead513e243ff6 WatchSource:0}: Error finding container d37d138f9ef43c90aa5ff916f920fe6ed7ecb4c2cf4cfaa131fead513e243ff6: Status 404 returned error can't find the container with id d37d138f9ef43c90aa5ff916f920fe6ed7ecb4c2cf4cfaa131fead513e243ff6 Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.963187 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hc475" Dec 11 05:15:59 crc kubenswrapper[4628]: I1211 05:15:59.990296 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:15:59 crc kubenswrapper[4628]: E1211 05:15:59.990734 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:16:00.490716433 +0000 UTC m=+62.908063131 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.098307 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:16:00 crc kubenswrapper[4628]: E1211 05:16:00.098610 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:16:00.59859906 +0000 UTC m=+63.015945758 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.131725 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mjcbm"] Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.146646 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbrk4"] Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.206234 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:16:00 crc kubenswrapper[4628]: E1211 05:16:00.206334 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:16:00.706310863 +0000 UTC m=+63.123657561 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.206529 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:16:00 crc kubenswrapper[4628]: E1211 05:16:00.206798 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:16:00.706790396 +0000 UTC m=+63.124137094 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.310598 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:16:00 crc kubenswrapper[4628]: E1211 05:16:00.310868 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:16:00.81082444 +0000 UTC m=+63.228171138 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.310952 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:16:00 crc kubenswrapper[4628]: E1211 05:16:00.311535 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:16:00.811525958 +0000 UTC m=+63.228872656 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.346363 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nrc6p"] Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.404157 4628 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.412580 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:16:00 crc kubenswrapper[4628]: E1211 05:16:00.413126 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:16:00.913111198 +0000 UTC m=+63.330457886 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.417494 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.418227 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.425686 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.425726 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.477149 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.500430 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbrk4" event={"ID":"eeedd8fe-1e9a-4009-a385-07d72fed1277","Type":"ContainerStarted","Data":"956ed7596c0d0812a671499e30cafa1b2174e9691cde28d8d8f5a251a7596803"} Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.515060 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hc475"] Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.516330 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6094aab-ea8b-4bc7-8899-a983c8d965ef-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b6094aab-ea8b-4bc7-8899-a983c8d965ef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.516361 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.516389 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6094aab-ea8b-4bc7-8899-a983c8d965ef-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b6094aab-ea8b-4bc7-8899-a983c8d965ef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 05:16:00 crc kubenswrapper[4628]: E1211 05:16:00.516661 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:16:01.016651489 +0000 UTC m=+63.433998187 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.537691 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjcbm" event={"ID":"2a9eb6ef-92ff-415b-a526-26711b88985f","Type":"ContainerStarted","Data":"e021ca4df42fd3927f44c33b397a758b31acd76861c9c363a38a0fa3253e19e5"} Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.579304 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" event={"ID":"12356db6-09ae-438c-a085-6b26ea3b97e8","Type":"ContainerStarted","Data":"753bc1bcadc87a9cbcc286bef9a571fb7b0a876ae3a4284a720c6d19b8b50be5"} Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.585680 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ece60403-400b-4da3-ab9c-b2030b94e0bf","Type":"ContainerStarted","Data":"d37d138f9ef43c90aa5ff916f920fe6ed7ecb4c2cf4cfaa131fead513e243ff6"} Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.590644 4628 generic.go:334] "Generic (PLEG): container finished" podID="a27ccd53-7ae0-4c98-9461-b545f841ea79" containerID="c001a154c7e0c4df6998f947574f71c78e47031ea578290737e99fc4b5c40ece" exitCode=0 Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.591541 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcx8h" event={"ID":"a27ccd53-7ae0-4c98-9461-b545f841ea79","Type":"ContainerDied","Data":"c001a154c7e0c4df6998f947574f71c78e47031ea578290737e99fc4b5c40ece"} Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.597197 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"21a2dedf5cb24e5a5cc84a4e0565b169e9dd27e772ca45ff97c6ecc4c2d73d66"} Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.617175 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.617434 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6094aab-ea8b-4bc7-8899-a983c8d965ef-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b6094aab-ea8b-4bc7-8899-a983c8d965ef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.617472 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6094aab-ea8b-4bc7-8899-a983c8d965ef-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b6094aab-ea8b-4bc7-8899-a983c8d965ef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 05:16:00 crc 
kubenswrapper[4628]: E1211 05:16:00.617801 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:16:01.117787406 +0000 UTC m=+63.535134104 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.621971 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6094aab-ea8b-4bc7-8899-a983c8d965ef-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b6094aab-ea8b-4bc7-8899-a983c8d965ef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.622791 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nrc6p" event={"ID":"3953bb62-252a-4109-9ed9-f14294565e1d","Type":"ContainerStarted","Data":"485442daadc69375a5d13f5ba7339d029c890896988c5167d70efe4e4efe38f5"} Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.635833 4628 generic.go:334] "Generic (PLEG): container finished" podID="209cebdd-7761-42a6-9bf1-089cc06c3dca" containerID="ea28b50587a764088b354a86fa7c17a4227fe9490d614e081e53ac7aeb376396" exitCode=0 Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.635883 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch" event={"ID":"209cebdd-7761-42a6-9bf1-089cc06c3dca","Type":"ContainerDied","Data":"ea28b50587a764088b354a86fa7c17a4227fe9490d614e081e53ac7aeb376396"} Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.688735 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6094aab-ea8b-4bc7-8899-a983c8d965ef-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b6094aab-ea8b-4bc7-8899-a983c8d965ef\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.720321 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:16:00 crc kubenswrapper[4628]: E1211 05:16:00.721407 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:16:01.221393979 +0000 UTC m=+63.638740677 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.777611 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.821132 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:16:00 crc kubenswrapper[4628]: E1211 05:16:00.821563 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:16:01.32154789 +0000 UTC m=+63.738894588 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.830105 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:16:00 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:16:00 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:16:00 crc kubenswrapper[4628]: healthz check failed Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.830151 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:16:00 crc kubenswrapper[4628]: I1211 05:16:00.922774 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:16:00 crc kubenswrapper[4628]: E1211 05:16:00.923117 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:16:01.423106309 +0000 UTC m=+63.840453007 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.023243 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:16:01 crc kubenswrapper[4628]: E1211 05:16:01.023832 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-11 05:16:01.523816025 +0000 UTC m=+63.941162723 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.105520 4628 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-11T05:16:00.40418153Z","Handler":null,"Name":""} Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.124297 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:16:01 crc kubenswrapper[4628]: E1211 05:16:01.124694 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-11 05:16:01.624683155 +0000 UTC m=+64.042029843 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6qvrg" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.135815 4628 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.135868 4628 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.225735 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.228591 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.239996 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 11 05:16:01 crc kubenswrapper[4628]: W1211 05:16:01.276775 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb6094aab_ea8b_4bc7_8899_a983c8d965ef.slice/crio-5e7ca66467532ea2f465ff3b738b9f8922fa38c8ee7805b5f74cc9515fb064cc WatchSource:0}: Error finding container 5e7ca66467532ea2f465ff3b738b9f8922fa38c8ee7805b5f74cc9515fb064cc: Status 404 returned error can't find the container with id 5e7ca66467532ea2f465ff3b738b9f8922fa38c8ee7805b5f74cc9515fb064cc Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.277594 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.282494 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-n4h96" Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.328357 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.337999 4628 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.338048 4628 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.506013 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6qvrg\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.531475 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.540772 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.627868 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.744338 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" event={"ID":"12356db6-09ae-438c-a085-6b26ea3b97e8","Type":"ContainerStarted","Data":"b78183dab9947be46ad29c264852cec18bef22d9fa2ba95198505b4da19a6bce"} Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.749885 4628 generic.go:334] "Generic (PLEG): container finished" podID="ece60403-400b-4da3-ab9c-b2030b94e0bf" containerID="e6af01b35d94fb78863b0a114a2c31a6ae883aa30ebe205314c9408bfc911972" exitCode=0 Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.749923 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ece60403-400b-4da3-ab9c-b2030b94e0bf","Type":"ContainerDied","Data":"e6af01b35d94fb78863b0a114a2c31a6ae883aa30ebe205314c9408bfc911972"} Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.776464 4628 generic.go:334] "Generic (PLEG): container finished" podID="fa15b6c3-774f-4d31-8b55-008c3786d329" containerID="7548293186cbe858d548c38beeff0a2cec9d088f66ca33206629e264a366f749" exitCode=0 Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.776535 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hc475" event={"ID":"fa15b6c3-774f-4d31-8b55-008c3786d329","Type":"ContainerDied","Data":"7548293186cbe858d548c38beeff0a2cec9d088f66ca33206629e264a366f749"} Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.776562 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hc475" event={"ID":"fa15b6c3-774f-4d31-8b55-008c3786d329","Type":"ContainerStarted","Data":"49e497a0cd159195b67846b75c0b9b306a2849e98f92e5e0fc9aa570bd09494c"} Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.789979 4628 generic.go:334] "Generic (PLEG): container 
finished" podID="3953bb62-252a-4109-9ed9-f14294565e1d" containerID="0fd98bd50b282ee42ebc696d1287d539b8df5fb99f2ec1ebfdcb7fc971cfdd6b" exitCode=0 Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.790962 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nrc6p" event={"ID":"3953bb62-252a-4109-9ed9-f14294565e1d","Type":"ContainerDied","Data":"0fd98bd50b282ee42ebc696d1287d539b8df5fb99f2ec1ebfdcb7fc971cfdd6b"} Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.795562 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:16:01 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:16:01 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:16:01 crc kubenswrapper[4628]: healthz check failed Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.795614 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.808353 4628 generic.go:334] "Generic (PLEG): container finished" podID="eeedd8fe-1e9a-4009-a385-07d72fed1277" containerID="34d85b8972bda27137ed5e69adf8197995290ad2c08f832f85adc69aa44b690a" exitCode=0 Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.808408 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbrk4" event={"ID":"eeedd8fe-1e9a-4009-a385-07d72fed1277","Type":"ContainerDied","Data":"34d85b8972bda27137ed5e69adf8197995290ad2c08f832f85adc69aa44b690a"} Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.819461 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b6094aab-ea8b-4bc7-8899-a983c8d965ef","Type":"ContainerStarted","Data":"5e7ca66467532ea2f465ff3b738b9f8922fa38c8ee7805b5f74cc9515fb064cc"} Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.839169 4628 generic.go:334] "Generic (PLEG): container finished" podID="2a9eb6ef-92ff-415b-a526-26711b88985f" containerID="5238591181beb8227757f3be992574259e6f93106e01c5f6c313116e19ae4c76" exitCode=0 Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.839535 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjcbm" event={"ID":"2a9eb6ef-92ff-415b-a526-26711b88985f","Type":"ContainerDied","Data":"5238591181beb8227757f3be992574259e6f93106e01c5f6c313116e19ae4c76"} Dec 11 05:16:01 crc kubenswrapper[4628]: I1211 05:16:01.921797 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.425315 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch" Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.458620 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6qvrg"] Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.570236 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxv6g\" (UniqueName: \"kubernetes.io/projected/209cebdd-7761-42a6-9bf1-089cc06c3dca-kube-api-access-fxv6g\") pod \"209cebdd-7761-42a6-9bf1-089cc06c3dca\" (UID: \"209cebdd-7761-42a6-9bf1-089cc06c3dca\") " Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.570305 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/209cebdd-7761-42a6-9bf1-089cc06c3dca-secret-volume\") pod \"209cebdd-7761-42a6-9bf1-089cc06c3dca\" (UID: \"209cebdd-7761-42a6-9bf1-089cc06c3dca\") " Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.570370 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/209cebdd-7761-42a6-9bf1-089cc06c3dca-config-volume\") pod \"209cebdd-7761-42a6-9bf1-089cc06c3dca\" (UID: \"209cebdd-7761-42a6-9bf1-089cc06c3dca\") " Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.572352 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/209cebdd-7761-42a6-9bf1-089cc06c3dca-config-volume" (OuterVolumeSpecName: "config-volume") pod "209cebdd-7761-42a6-9bf1-089cc06c3dca" (UID: "209cebdd-7761-42a6-9bf1-089cc06c3dca"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.577120 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/209cebdd-7761-42a6-9bf1-089cc06c3dca-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "209cebdd-7761-42a6-9bf1-089cc06c3dca" (UID: "209cebdd-7761-42a6-9bf1-089cc06c3dca"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.587136 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/209cebdd-7761-42a6-9bf1-089cc06c3dca-kube-api-access-fxv6g" (OuterVolumeSpecName: "kube-api-access-fxv6g") pod "209cebdd-7761-42a6-9bf1-089cc06c3dca" (UID: "209cebdd-7761-42a6-9bf1-089cc06c3dca"). InnerVolumeSpecName "kube-api-access-fxv6g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.672501 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxv6g\" (UniqueName: \"kubernetes.io/projected/209cebdd-7761-42a6-9bf1-089cc06c3dca-kube-api-access-fxv6g\") on node \"crc\" DevicePath \"\"" Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.672549 4628 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/209cebdd-7761-42a6-9bf1-089cc06c3dca-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.672560 4628 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/209cebdd-7761-42a6-9bf1-089cc06c3dca-config-volume\") on node \"crc\" DevicePath \"\"" Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.691106 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7kdnp" Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.794544 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:16:02 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:16:02 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:16:02 crc kubenswrapper[4628]: healthz check failed Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.794587 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.876106 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" event={"ID":"12356db6-09ae-438c-a085-6b26ea3b97e8","Type":"ContainerStarted","Data":"2af103483587e3694c2caf8ef8c72a2ed563dcbb57a35f60f85f016eb4a24c4e"} Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.882750 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch" event={"ID":"209cebdd-7761-42a6-9bf1-089cc06c3dca","Type":"ContainerDied","Data":"994ae7bde5429dc153d8ed22126ee2ce8f512612a1ebe3a917f4346646ac13c7"} Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.882758 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch" Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.882924 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="994ae7bde5429dc153d8ed22126ee2ce8f512612a1ebe3a917f4346646ac13c7" Dec 11 05:16:02 crc kubenswrapper[4628]: I1211 05:16:02.901039 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-cjb5x" podStartSLOduration=18.901026467 podStartE2EDuration="18.901026467s" podCreationTimestamp="2025-12-11 05:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:16:02.899338161 +0000 UTC m=+65.316684859" watchObservedRunningTime="2025-12-11 05:16:02.901026467 +0000 UTC m=+65.318373165" Dec 11 05:16:03 crc kubenswrapper[4628]: W1211 05:16:03.435277 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa963e29_dda2_4d61_827f_2da2d53bfe52.slice/crio-a031c752421dd5ade06805969648aca8d48fa22fb298c058fcd6776f6598771f WatchSource:0}: Error finding container a031c752421dd5ade06805969648aca8d48fa22fb298c058fcd6776f6598771f: Status 404 returned error can't find the container with id a031c752421dd5ade06805969648aca8d48fa22fb298c058fcd6776f6598771f Dec 11 05:16:03 crc kubenswrapper[4628]: I1211 05:16:03.507230 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 11 05:16:03 crc kubenswrapper[4628]: I1211 05:16:03.595495 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ece60403-400b-4da3-ab9c-b2030b94e0bf-kube-api-access\") pod \"ece60403-400b-4da3-ab9c-b2030b94e0bf\" (UID: \"ece60403-400b-4da3-ab9c-b2030b94e0bf\") " Dec 11 05:16:03 crc kubenswrapper[4628]: I1211 05:16:03.595552 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ece60403-400b-4da3-ab9c-b2030b94e0bf-kubelet-dir\") pod \"ece60403-400b-4da3-ab9c-b2030b94e0bf\" (UID: \"ece60403-400b-4da3-ab9c-b2030b94e0bf\") " Dec 11 05:16:03 crc kubenswrapper[4628]: I1211 05:16:03.595925 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ece60403-400b-4da3-ab9c-b2030b94e0bf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ece60403-400b-4da3-ab9c-b2030b94e0bf" (UID: "ece60403-400b-4da3-ab9c-b2030b94e0bf"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:16:03 crc kubenswrapper[4628]: I1211 05:16:03.607964 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ece60403-400b-4da3-ab9c-b2030b94e0bf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ece60403-400b-4da3-ab9c-b2030b94e0bf" (UID: "ece60403-400b-4da3-ab9c-b2030b94e0bf"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:16:03 crc kubenswrapper[4628]: I1211 05:16:03.698060 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ece60403-400b-4da3-ab9c-b2030b94e0bf-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 11 05:16:03 crc kubenswrapper[4628]: I1211 05:16:03.698093 4628 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ece60403-400b-4da3-ab9c-b2030b94e0bf-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 11 05:16:03 crc kubenswrapper[4628]: I1211 05:16:03.797403 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:16:03 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:16:03 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:16:03 crc kubenswrapper[4628]: healthz check failed Dec 11 05:16:03 crc kubenswrapper[4628]: I1211 05:16:03.797555 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:16:03 crc kubenswrapper[4628]: I1211 05:16:03.905292 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" event={"ID":"fa963e29-dda2-4d61-827f-2da2d53bfe52","Type":"ContainerStarted","Data":"c230ba630b8d97f4a4826e1ff29a086a5a0da65b17f686c25ca5f3c18789357b"} Dec 11 05:16:03 crc kubenswrapper[4628]: I1211 05:16:03.905612 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:16:03 crc kubenswrapper[4628]: I1211 05:16:03.905647 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" event={"ID":"fa963e29-dda2-4d61-827f-2da2d53bfe52","Type":"ContainerStarted","Data":"a031c752421dd5ade06805969648aca8d48fa22fb298c058fcd6776f6598771f"} Dec 11 05:16:03 crc kubenswrapper[4628]: I1211 05:16:03.915938 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b6094aab-ea8b-4bc7-8899-a983c8d965ef","Type":"ContainerStarted","Data":"2624c1065844991b6b468cb10db8d2f770acac43cbdb7d501c5ece516cb096c4"} Dec 11 05:16:03 crc kubenswrapper[4628]: I1211 05:16:03.923347 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 11 05:16:03 crc kubenswrapper[4628]: I1211 05:16:03.923599 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ece60403-400b-4da3-ab9c-b2030b94e0bf","Type":"ContainerDied","Data":"d37d138f9ef43c90aa5ff916f920fe6ed7ecb4c2cf4cfaa131fead513e243ff6"} Dec 11 05:16:03 crc kubenswrapper[4628]: I1211 05:16:03.923619 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d37d138f9ef43c90aa5ff916f920fe6ed7ecb4c2cf4cfaa131fead513e243ff6" Dec 11 05:16:03 crc kubenswrapper[4628]: I1211 05:16:03.937309 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" podStartSLOduration=32.937289032 podStartE2EDuration="32.937289032s" podCreationTimestamp="2025-12-11 05:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:16:03.931446667 +0000 UTC m=+66.348793365" watchObservedRunningTime="2025-12-11 05:16:03.937289032 +0000 UTC m=+66.354635730" Dec 11 05:16:03 crc kubenswrapper[4628]: I1211 05:16:03.960731 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.960715267 podStartE2EDuration="3.960715267s" podCreationTimestamp="2025-12-11 05:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:16:03.959414323 +0000 UTC m=+66.376761021" watchObservedRunningTime="2025-12-11 05:16:03.960715267 +0000 UTC m=+66.378061965" Dec 11 05:16:04 crc kubenswrapper[4628]: I1211 05:16:04.795195 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:16:04 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:16:04 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:16:04 crc kubenswrapper[4628]: healthz check failed Dec 11 05:16:04 crc kubenswrapper[4628]: I1211 05:16:04.795549 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:16:04 crc kubenswrapper[4628]: I1211 05:16:04.963174 4628 generic.go:334] "Generic (PLEG): container finished" podID="b6094aab-ea8b-4bc7-8899-a983c8d965ef" containerID="2624c1065844991b6b468cb10db8d2f770acac43cbdb7d501c5ece516cb096c4" exitCode=0 Dec 11 05:16:04 crc kubenswrapper[4628]: I1211 05:16:04.963307 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b6094aab-ea8b-4bc7-8899-a983c8d965ef","Type":"ContainerDied","Data":"2624c1065844991b6b468cb10db8d2f770acac43cbdb7d501c5ece516cb096c4"} Dec 11 05:16:05 crc kubenswrapper[4628]: I1211 05:16:05.796612 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:16:05 crc kubenswrapper[4628]: 
[-]has-synced failed: reason withheld Dec 11 05:16:05 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:16:05 crc kubenswrapper[4628]: healthz check failed Dec 11 05:16:05 crc kubenswrapper[4628]: I1211 05:16:05.796675 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:16:06 crc kubenswrapper[4628]: I1211 05:16:06.828066 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:16:06 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:16:06 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:16:06 crc kubenswrapper[4628]: healthz check failed Dec 11 05:16:06 crc kubenswrapper[4628]: I1211 05:16:06.828376 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:16:06 crc kubenswrapper[4628]: I1211 05:16:06.855317 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 05:16:06 crc kubenswrapper[4628]: I1211 05:16:06.871618 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6094aab-ea8b-4bc7-8899-a983c8d965ef-kube-api-access\") pod \"b6094aab-ea8b-4bc7-8899-a983c8d965ef\" (UID: \"b6094aab-ea8b-4bc7-8899-a983c8d965ef\") " Dec 11 05:16:06 crc kubenswrapper[4628]: I1211 05:16:06.871733 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6094aab-ea8b-4bc7-8899-a983c8d965ef-kubelet-dir\") pod \"b6094aab-ea8b-4bc7-8899-a983c8d965ef\" (UID: \"b6094aab-ea8b-4bc7-8899-a983c8d965ef\") " Dec 11 05:16:06 crc kubenswrapper[4628]: I1211 05:16:06.872825 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6094aab-ea8b-4bc7-8899-a983c8d965ef-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b6094aab-ea8b-4bc7-8899-a983c8d965ef" (UID: "b6094aab-ea8b-4bc7-8899-a983c8d965ef"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:16:06 crc kubenswrapper[4628]: I1211 05:16:06.880055 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6094aab-ea8b-4bc7-8899-a983c8d965ef-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b6094aab-ea8b-4bc7-8899-a983c8d965ef" (UID: "b6094aab-ea8b-4bc7-8899-a983c8d965ef"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:16:06 crc kubenswrapper[4628]: I1211 05:16:06.975439 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6094aab-ea8b-4bc7-8899-a983c8d965ef-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 11 05:16:06 crc kubenswrapper[4628]: I1211 05:16:06.975668 4628 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6094aab-ea8b-4bc7-8899-a983c8d965ef-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 11 05:16:07 crc kubenswrapper[4628]: I1211 05:16:07.008918 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b6094aab-ea8b-4bc7-8899-a983c8d965ef","Type":"ContainerDied","Data":"5e7ca66467532ea2f465ff3b738b9f8922fa38c8ee7805b5f74cc9515fb064cc"} Dec 11 05:16:07 crc kubenswrapper[4628]: I1211 05:16:07.008952 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e7ca66467532ea2f465ff3b738b9f8922fa38c8ee7805b5f74cc9515fb064cc" Dec 11 05:16:07 crc kubenswrapper[4628]: I1211 05:16:07.009011 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 11 05:16:07 crc kubenswrapper[4628]: I1211 05:16:07.285609 4628 patch_prober.go:28] interesting pod/console-f9d7485db-4nw5h container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Dec 11 05:16:07 crc kubenswrapper[4628]: I1211 05:16:07.285657 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-4nw5h" podUID="5111b417-34a8-405f-a0b8-eab04e144ff8" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Dec 11 05:16:07 crc kubenswrapper[4628]: I1211 05:16:07.794371 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:16:07 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:16:07 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:16:07 crc kubenswrapper[4628]: healthz check failed Dec 11 05:16:07 crc kubenswrapper[4628]: I1211 05:16:07.794437 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:16:07 crc kubenswrapper[4628]: I1211 05:16:07.840633 4628 patch_prober.go:28] interesting pod/downloads-7954f5f757-xsldw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Dec 11 05:16:07 crc kubenswrapper[4628]: I1211 05:16:07.840688 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xsldw" podUID="a73e280a-008c-4e72-8844-375de50d4222" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Dec 11 
05:16:07 crc kubenswrapper[4628]: I1211 05:16:07.840712 4628 patch_prober.go:28] interesting pod/downloads-7954f5f757-xsldw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Dec 11 05:16:07 crc kubenswrapper[4628]: I1211 05:16:07.840760 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-xsldw" podUID="a73e280a-008c-4e72-8844-375de50d4222" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Dec 11 05:16:07 crc kubenswrapper[4628]: E1211 05:16:07.863957 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 11 05:16:07 crc kubenswrapper[4628]: E1211 05:16:07.867583 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 11 05:16:07 crc kubenswrapper[4628]: E1211 05:16:07.904838 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 11 05:16:07 crc kubenswrapper[4628]: E1211 05:16:07.904933 4628 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" podUID="1eefadbf-ac92-4b97-999e-fb262b5d45c2" containerName="kube-multus-additional-cni-plugins" Dec 11 05:16:08 crc kubenswrapper[4628]: I1211 05:16:08.798996 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:16:08 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:16:08 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:16:08 crc kubenswrapper[4628]: healthz check failed Dec 11 05:16:08 crc kubenswrapper[4628]: I1211 05:16:08.799342 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:16:09 crc kubenswrapper[4628]: I1211 05:16:09.794575 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:16:09 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:16:09 crc kubenswrapper[4628]: [+]process-running ok Dec 11 
05:16:09 crc kubenswrapper[4628]: healthz check failed Dec 11 05:16:09 crc kubenswrapper[4628]: I1211 05:16:09.794665 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:16:10 crc kubenswrapper[4628]: I1211 05:16:10.794273 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:16:10 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:16:10 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:16:10 crc kubenswrapper[4628]: healthz check failed Dec 11 05:16:10 crc kubenswrapper[4628]: I1211 05:16:10.794563 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:16:11 crc kubenswrapper[4628]: I1211 05:16:11.805396 4628 patch_prober.go:28] interesting pod/router-default-5444994796-8z6mf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 11 05:16:11 crc kubenswrapper[4628]: [-]has-synced failed: reason withheld Dec 11 05:16:11 crc kubenswrapper[4628]: [+]process-running ok Dec 11 05:16:11 crc kubenswrapper[4628]: healthz check failed Dec 11 05:16:11 crc kubenswrapper[4628]: I1211 05:16:11.805455 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8z6mf" podUID="b1dac6ca-2acb-4ec2-bd04-c307aa26c17f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:16:12 crc kubenswrapper[4628]: I1211 05:16:12.794110 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:16:12 crc kubenswrapper[4628]: I1211 05:16:12.796952 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-8z6mf" Dec 11 05:16:17 crc kubenswrapper[4628]: I1211 05:16:17.697313 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:16:17 crc kubenswrapper[4628]: I1211 05:16:17.702694 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:16:17 crc kubenswrapper[4628]: I1211 05:16:17.845426 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-xsldw" Dec 11 05:16:17 crc kubenswrapper[4628]: E1211 05:16:17.867905 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 11 05:16:17 crc kubenswrapper[4628]: E1211 05:16:17.874668 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an 
exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 11 05:16:17 crc kubenswrapper[4628]: E1211 05:16:17.881905 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 11 05:16:17 crc kubenswrapper[4628]: E1211 05:16:17.881948 4628 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" podUID="1eefadbf-ac92-4b97-999e-fb262b5d45c2" containerName="kube-multus-additional-cni-plugins" Dec 11 05:16:21 crc kubenswrapper[4628]: I1211 05:16:21.548975 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:16:25 crc kubenswrapper[4628]: I1211 05:16:25.904424 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 11 05:16:27 crc kubenswrapper[4628]: E1211 05:16:27.862496 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63 is running failed: container process not found" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 11 05:16:27 crc kubenswrapper[4628]: E1211 05:16:27.863376 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63 is running failed: container process not found" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 11 05:16:27 crc kubenswrapper[4628]: E1211 05:16:27.864271 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63 is running failed: container process not found" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 11 05:16:27 crc kubenswrapper[4628]: E1211 05:16:27.864375 4628 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" podUID="1eefadbf-ac92-4b97-999e-fb262b5d45c2" containerName="kube-multus-additional-cni-plugins" Dec 11 05:16:27 crc kubenswrapper[4628]: I1211 05:16:27.924605 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=2.924574661 podStartE2EDuration="2.924574661s" podCreationTimestamp="2025-12-11 05:16:25 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:16:27.914377877 +0000 UTC m=+90.331724605" watchObservedRunningTime="2025-12-11 05:16:27.924574661 +0000 UTC m=+90.341921389" Dec 11 05:16:27 crc kubenswrapper[4628]: I1211 05:16:27.935493 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 11 05:16:28 crc kubenswrapper[4628]: I1211 05:16:28.183943 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-f954d" Dec 11 05:16:28 crc kubenswrapper[4628]: I1211 05:16:28.269871 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.2698547279999999 podStartE2EDuration="1.269854728s" podCreationTimestamp="2025-12-11 05:16:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:16:28.239076848 +0000 UTC m=+90.656423556" watchObservedRunningTime="2025-12-11 05:16:28.269854728 +0000 UTC m=+90.687201426" Dec 11 05:16:31 crc kubenswrapper[4628]: I1211 05:16:31.319665 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-whf6x_1eefadbf-ac92-4b97-999e-fb262b5d45c2/kube-multus-additional-cni-plugins/0.log" Dec 11 05:16:31 crc kubenswrapper[4628]: I1211 05:16:31.320058 4628 generic.go:334] "Generic (PLEG): container finished" podID="1eefadbf-ac92-4b97-999e-fb262b5d45c2" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" exitCode=137 Dec 11 05:16:31 crc kubenswrapper[4628]: I1211 05:16:31.320103 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" event={"ID":"1eefadbf-ac92-4b97-999e-fb262b5d45c2","Type":"ContainerDied","Data":"28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63"} Dec 11 05:16:33 crc kubenswrapper[4628]: I1211 05:16:33.599869 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 11 05:16:33 crc kubenswrapper[4628]: E1211 05:16:33.600077 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6094aab-ea8b-4bc7-8899-a983c8d965ef" containerName="pruner" Dec 11 05:16:33 crc kubenswrapper[4628]: I1211 05:16:33.600089 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6094aab-ea8b-4bc7-8899-a983c8d965ef" containerName="pruner" Dec 11 05:16:33 crc kubenswrapper[4628]: E1211 05:16:33.600100 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="209cebdd-7761-42a6-9bf1-089cc06c3dca" containerName="collect-profiles" Dec 11 05:16:33 crc kubenswrapper[4628]: I1211 05:16:33.600107 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="209cebdd-7761-42a6-9bf1-089cc06c3dca" containerName="collect-profiles" Dec 11 05:16:33 crc kubenswrapper[4628]: E1211 05:16:33.600120 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece60403-400b-4da3-ab9c-b2030b94e0bf" containerName="pruner" Dec 11 05:16:33 crc kubenswrapper[4628]: I1211 05:16:33.600127 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece60403-400b-4da3-ab9c-b2030b94e0bf" containerName="pruner" Dec 11 05:16:33 crc kubenswrapper[4628]: I1211 05:16:33.600221 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6094aab-ea8b-4bc7-8899-a983c8d965ef" containerName="pruner" Dec 11 
05:16:33 crc kubenswrapper[4628]: I1211 05:16:33.600236 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="209cebdd-7761-42a6-9bf1-089cc06c3dca" containerName="collect-profiles" Dec 11 05:16:33 crc kubenswrapper[4628]: I1211 05:16:33.600246 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="ece60403-400b-4da3-ab9c-b2030b94e0bf" containerName="pruner" Dec 11 05:16:33 crc kubenswrapper[4628]: I1211 05:16:33.600615 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 05:16:33 crc kubenswrapper[4628]: I1211 05:16:33.604302 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 11 05:16:33 crc kubenswrapper[4628]: I1211 05:16:33.604513 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 11 05:16:33 crc kubenswrapper[4628]: I1211 05:16:33.614293 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 11 05:16:33 crc kubenswrapper[4628]: I1211 05:16:33.704698 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c556f8d-9095-4617-beb5-a9fdc918a365-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5c556f8d-9095-4617-beb5-a9fdc918a365\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 05:16:33 crc kubenswrapper[4628]: I1211 05:16:33.704790 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c556f8d-9095-4617-beb5-a9fdc918a365-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5c556f8d-9095-4617-beb5-a9fdc918a365\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 05:16:33 crc kubenswrapper[4628]: I1211 05:16:33.806489 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c556f8d-9095-4617-beb5-a9fdc918a365-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5c556f8d-9095-4617-beb5-a9fdc918a365\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 05:16:33 crc kubenswrapper[4628]: I1211 05:16:33.806659 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c556f8d-9095-4617-beb5-a9fdc918a365-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5c556f8d-9095-4617-beb5-a9fdc918a365\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 05:16:33 crc kubenswrapper[4628]: I1211 05:16:33.806660 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c556f8d-9095-4617-beb5-a9fdc918a365-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5c556f8d-9095-4617-beb5-a9fdc918a365\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 05:16:33 crc kubenswrapper[4628]: I1211 05:16:33.826504 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c556f8d-9095-4617-beb5-a9fdc918a365-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5c556f8d-9095-4617-beb5-a9fdc918a365\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 05:16:33 crc kubenswrapper[4628]: I1211 05:16:33.917827 4628 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 05:16:36 crc kubenswrapper[4628]: I1211 05:16:36.277445 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 11 05:16:37 crc kubenswrapper[4628]: E1211 05:16:37.863281 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63 is running failed: container process not found" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 11 05:16:37 crc kubenswrapper[4628]: E1211 05:16:37.863841 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63 is running failed: container process not found" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 11 05:16:37 crc kubenswrapper[4628]: E1211 05:16:37.864398 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63 is running failed: container process not found" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 11 05:16:37 crc kubenswrapper[4628]: E1211 05:16:37.864470 4628 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" podUID="1eefadbf-ac92-4b97-999e-fb262b5d45c2" containerName="kube-multus-additional-cni-plugins" Dec 11 05:16:39 crc kubenswrapper[4628]: I1211 05:16:39.408928 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 11 05:16:39 crc kubenswrapper[4628]: I1211 05:16:39.411075 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 11 05:16:39 crc kubenswrapper[4628]: I1211 05:16:39.429630 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 11 05:16:39 crc kubenswrapper[4628]: I1211 05:16:39.507233 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/843a2bbd-6914-4686-a14e-f05f88ddcc07-kubelet-dir\") pod \"installer-9-crc\" (UID: \"843a2bbd-6914-4686-a14e-f05f88ddcc07\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 11 05:16:39 crc kubenswrapper[4628]: I1211 05:16:39.507516 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/843a2bbd-6914-4686-a14e-f05f88ddcc07-kube-api-access\") pod \"installer-9-crc\" (UID: \"843a2bbd-6914-4686-a14e-f05f88ddcc07\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 11 05:16:39 crc kubenswrapper[4628]: I1211 05:16:39.507581 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/843a2bbd-6914-4686-a14e-f05f88ddcc07-var-lock\") pod \"installer-9-crc\" (UID: \"843a2bbd-6914-4686-a14e-f05f88ddcc07\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 11 05:16:39 crc kubenswrapper[4628]: I1211 05:16:39.609214 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/843a2bbd-6914-4686-a14e-f05f88ddcc07-kube-api-access\") pod \"installer-9-crc\" (UID: \"843a2bbd-6914-4686-a14e-f05f88ddcc07\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 11 05:16:39 crc kubenswrapper[4628]: I1211 05:16:39.609269 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/843a2bbd-6914-4686-a14e-f05f88ddcc07-var-lock\") pod \"installer-9-crc\" (UID: \"843a2bbd-6914-4686-a14e-f05f88ddcc07\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 11 05:16:39 crc kubenswrapper[4628]: I1211 05:16:39.609319 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/843a2bbd-6914-4686-a14e-f05f88ddcc07-kubelet-dir\") pod \"installer-9-crc\" (UID: \"843a2bbd-6914-4686-a14e-f05f88ddcc07\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 11 05:16:39 crc kubenswrapper[4628]: I1211 05:16:39.609412 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/843a2bbd-6914-4686-a14e-f05f88ddcc07-kubelet-dir\") pod \"installer-9-crc\" (UID: \"843a2bbd-6914-4686-a14e-f05f88ddcc07\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 11 05:16:39 crc kubenswrapper[4628]: I1211 05:16:39.609496 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/843a2bbd-6914-4686-a14e-f05f88ddcc07-var-lock\") pod \"installer-9-crc\" (UID: \"843a2bbd-6914-4686-a14e-f05f88ddcc07\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 11 05:16:39 crc kubenswrapper[4628]: I1211 05:16:39.633567 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/843a2bbd-6914-4686-a14e-f05f88ddcc07-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"843a2bbd-6914-4686-a14e-f05f88ddcc07\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 11 05:16:39 crc kubenswrapper[4628]: I1211 05:16:39.751811 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 11 05:16:47 crc kubenswrapper[4628]: E1211 05:16:47.862469 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63 is running failed: container process not found" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 11 05:16:47 crc kubenswrapper[4628]: E1211 05:16:47.863771 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63 is running failed: container process not found" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 11 05:16:47 crc kubenswrapper[4628]: E1211 05:16:47.864428 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63 is running failed: container process not found" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 11 05:16:47 crc kubenswrapper[4628]: E1211 05:16:47.864485 4628 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" podUID="1eefadbf-ac92-4b97-999e-fb262b5d45c2" containerName="kube-multus-additional-cni-plugins" Dec 11 05:16:52 crc kubenswrapper[4628]: I1211 05:16:52.738210 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-whf6x_1eefadbf-ac92-4b97-999e-fb262b5d45c2/kube-multus-additional-cni-plugins/0.log" Dec 11 05:16:52 crc kubenswrapper[4628]: I1211 05:16:52.738803 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" Dec 11 05:16:52 crc kubenswrapper[4628]: I1211 05:16:52.919863 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1eefadbf-ac92-4b97-999e-fb262b5d45c2-cni-sysctl-allowlist\") pod \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\" (UID: \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\") " Dec 11 05:16:52 crc kubenswrapper[4628]: I1211 05:16:52.920084 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1eefadbf-ac92-4b97-999e-fb262b5d45c2-tuning-conf-dir\") pod \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\" (UID: \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\") " Dec 11 05:16:52 crc kubenswrapper[4628]: I1211 05:16:52.920120 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1eefadbf-ac92-4b97-999e-fb262b5d45c2-ready\") pod \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\" (UID: \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\") " Dec 11 05:16:52 crc kubenswrapper[4628]: I1211 05:16:52.920491 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1eefadbf-ac92-4b97-999e-fb262b5d45c2-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "1eefadbf-ac92-4b97-999e-fb262b5d45c2" (UID: "1eefadbf-ac92-4b97-999e-fb262b5d45c2"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:16:52 crc kubenswrapper[4628]: I1211 05:16:52.920552 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1eefadbf-ac92-4b97-999e-fb262b5d45c2-ready" (OuterVolumeSpecName: "ready") pod "1eefadbf-ac92-4b97-999e-fb262b5d45c2" (UID: "1eefadbf-ac92-4b97-999e-fb262b5d45c2"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:16:52 crc kubenswrapper[4628]: I1211 05:16:52.920653 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgkkc\" (UniqueName: \"kubernetes.io/projected/1eefadbf-ac92-4b97-999e-fb262b5d45c2-kube-api-access-kgkkc\") pod \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\" (UID: \"1eefadbf-ac92-4b97-999e-fb262b5d45c2\") " Dec 11 05:16:52 crc kubenswrapper[4628]: I1211 05:16:52.920738 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1eefadbf-ac92-4b97-999e-fb262b5d45c2-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "1eefadbf-ac92-4b97-999e-fb262b5d45c2" (UID: "1eefadbf-ac92-4b97-999e-fb262b5d45c2"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:16:52 crc kubenswrapper[4628]: I1211 05:16:52.921061 4628 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1eefadbf-ac92-4b97-999e-fb262b5d45c2-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 11 05:16:52 crc kubenswrapper[4628]: I1211 05:16:52.921074 4628 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1eefadbf-ac92-4b97-999e-fb262b5d45c2-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Dec 11 05:16:52 crc kubenswrapper[4628]: I1211 05:16:52.921083 4628 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/1eefadbf-ac92-4b97-999e-fb262b5d45c2-ready\") on node \"crc\" DevicePath \"\"" Dec 11 05:16:52 crc kubenswrapper[4628]: I1211 05:16:52.927834 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eefadbf-ac92-4b97-999e-fb262b5d45c2-kube-api-access-kgkkc" (OuterVolumeSpecName: "kube-api-access-kgkkc") pod "1eefadbf-ac92-4b97-999e-fb262b5d45c2" (UID: "1eefadbf-ac92-4b97-999e-fb262b5d45c2"). InnerVolumeSpecName "kube-api-access-kgkkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:16:53 crc kubenswrapper[4628]: I1211 05:16:53.023033 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgkkc\" (UniqueName: \"kubernetes.io/projected/1eefadbf-ac92-4b97-999e-fb262b5d45c2-kube-api-access-kgkkc\") on node \"crc\" DevicePath \"\"" Dec 11 05:16:53 crc kubenswrapper[4628]: I1211 05:16:53.473638 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-whf6x_1eefadbf-ac92-4b97-999e-fb262b5d45c2/kube-multus-additional-cni-plugins/0.log" Dec 11 05:16:53 crc kubenswrapper[4628]: I1211 05:16:53.473687 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" event={"ID":"1eefadbf-ac92-4b97-999e-fb262b5d45c2","Type":"ContainerDied","Data":"8cdde9458cef48bbd9c4dbe1b5ebc960e1d346cf76c9c20a6bed01e037e156c8"} Dec 11 05:16:53 crc kubenswrapper[4628]: I1211 05:16:53.473726 4628 scope.go:117] "RemoveContainer" containerID="28aed3c92127b5d244aaeb0404000800889eabfd11738fdae333e856551a0a63" Dec 11 05:16:53 crc kubenswrapper[4628]: I1211 05:16:53.473764 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-whf6x" Dec 11 05:16:53 crc kubenswrapper[4628]: I1211 05:16:53.506425 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-whf6x"] Dec 11 05:16:53 crc kubenswrapper[4628]: I1211 05:16:53.510027 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-whf6x"] Dec 11 05:16:53 crc kubenswrapper[4628]: I1211 05:16:53.895207 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eefadbf-ac92-4b97-999e-fb262b5d45c2" path="/var/lib/kubelet/pods/1eefadbf-ac92-4b97-999e-fb262b5d45c2/volumes" Dec 11 05:17:01 crc kubenswrapper[4628]: E1211 05:17:01.370051 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 11 05:17:01 crc kubenswrapper[4628]: E1211 05:17:01.370775 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cf68p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hjbgr_openshift-marketplace(445e77bd-611f-486b-af50-16e4476e29e4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 11 05:17:01 crc kubenswrapper[4628]: E1211 05:17:01.372679 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hjbgr" podUID="445e77bd-611f-486b-af50-16e4476e29e4" Dec 11 05:17:01 crc kubenswrapper[4628]: I1211 05:17:01.610236 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wg98b"] Dec 11 05:17:04 crc kubenswrapper[4628]: E1211 05:17:04.353126 4628 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hjbgr" podUID="445e77bd-611f-486b-af50-16e4476e29e4" Dec 11 05:17:05 crc kubenswrapper[4628]: E1211 05:17:05.268278 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 11 05:17:05 crc kubenswrapper[4628]: E1211 05:17:05.268415 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cgnh4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rbrk4_openshift-marketplace(eeedd8fe-1e9a-4009-a385-07d72fed1277): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 11 05:17:05 crc kubenswrapper[4628]: E1211 05:17:05.269586 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rbrk4" podUID="eeedd8fe-1e9a-4009-a385-07d72fed1277" Dec 11 05:17:05 crc kubenswrapper[4628]: E1211 05:17:05.358098 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 11 05:17:05 crc kubenswrapper[4628]: E1211 05:17:05.358253 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bh82m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-nrc6p_openshift-marketplace(3953bb62-252a-4109-9ed9-f14294565e1d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 11 05:17:05 crc kubenswrapper[4628]: E1211 05:17:05.359434 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-nrc6p" podUID="3953bb62-252a-4109-9ed9-f14294565e1d" Dec 11 05:17:06 crc kubenswrapper[4628]: E1211 05:17:06.400111 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-nrc6p" podUID="3953bb62-252a-4109-9ed9-f14294565e1d" Dec 11 05:17:06 crc kubenswrapper[4628]: E1211 05:17:06.400155 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rbrk4" podUID="eeedd8fe-1e9a-4009-a385-07d72fed1277" Dec 11 05:17:06 crc kubenswrapper[4628]: E1211 05:17:06.485471 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 11 05:17:06 crc kubenswrapper[4628]: E1211 05:17:06.485598 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9swxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7m4g7_openshift-marketplace(3d5f88ec-256a-4556-b06b-814dfa23c87b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 11 05:17:06 crc kubenswrapper[4628]: E1211 05:17:06.486727 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7m4g7" podUID="3d5f88ec-256a-4556-b06b-814dfa23c87b" Dec 11 05:17:06 crc kubenswrapper[4628]: E1211 05:17:06.493382 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 11 05:17:06 crc kubenswrapper[4628]: E1211 05:17:06.493684 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8hqpf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-k7hq5_openshift-marketplace(b719f066-3997-4c70-bfb1-b489c56e2ef4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 11 05:17:06 crc kubenswrapper[4628]: E1211 05:17:06.495585 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-k7hq5" podUID="b719f066-3997-4c70-bfb1-b489c56e2ef4" Dec 11 05:17:06 crc kubenswrapper[4628]: E1211 05:17:06.511277 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 11 05:17:06 crc kubenswrapper[4628]: E1211 05:17:06.511444 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m597z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-dcx8h_openshift-marketplace(a27ccd53-7ae0-4c98-9461-b545f841ea79): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 11 05:17:06 crc kubenswrapper[4628]: E1211 05:17:06.512773 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-dcx8h" podUID="a27ccd53-7ae0-4c98-9461-b545f841ea79" Dec 11 05:17:14 crc kubenswrapper[4628]: E1211 05:17:14.284482 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-dcx8h" podUID="a27ccd53-7ae0-4c98-9461-b545f841ea79" Dec 11 05:17:14 crc kubenswrapper[4628]: E1211 05:17:14.284971 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7m4g7" podUID="3d5f88ec-256a-4556-b06b-814dfa23c87b" Dec 11 05:17:14 crc kubenswrapper[4628]: E1211 05:17:14.285061 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-k7hq5" podUID="b719f066-3997-4c70-bfb1-b489c56e2ef4" Dec 11 05:17:14 crc kubenswrapper[4628]: I1211 05:17:14.737198 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 11 05:17:14 crc kubenswrapper[4628]: W1211 05:17:14.747868 4628 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod5c556f8d_9095_4617_beb5_a9fdc918a365.slice/crio-93a57479ee39e0ba0cb70b0bc3e73cde3d0f601ca78688c47d96a73a7133ce7c WatchSource:0}: Error finding container 93a57479ee39e0ba0cb70b0bc3e73cde3d0f601ca78688c47d96a73a7133ce7c: Status 404 returned error can't find the container with id 93a57479ee39e0ba0cb70b0bc3e73cde3d0f601ca78688c47d96a73a7133ce7c Dec 11 05:17:14 crc kubenswrapper[4628]: I1211 05:17:14.748977 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 11 05:17:14 crc kubenswrapper[4628]: W1211 05:17:14.755144 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod843a2bbd_6914_4686_a14e_f05f88ddcc07.slice/crio-4361ca9db65c08e41482440c49ed418b60376c136e650eaef275916633e53e66 WatchSource:0}: Error finding container 4361ca9db65c08e41482440c49ed418b60376c136e650eaef275916633e53e66: Status 404 returned error can't find the container with id 4361ca9db65c08e41482440c49ed418b60376c136e650eaef275916633e53e66 Dec 11 05:17:15 crc kubenswrapper[4628]: I1211 05:17:15.657902 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"843a2bbd-6914-4686-a14e-f05f88ddcc07","Type":"ContainerStarted","Data":"4361ca9db65c08e41482440c49ed418b60376c136e650eaef275916633e53e66"} Dec 11 05:17:15 crc kubenswrapper[4628]: I1211 05:17:15.658689 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5c556f8d-9095-4617-beb5-a9fdc918a365","Type":"ContainerStarted","Data":"93a57479ee39e0ba0cb70b0bc3e73cde3d0f601ca78688c47d96a73a7133ce7c"} Dec 11 05:17:19 crc kubenswrapper[4628]: E1211 05:17:19.091899 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 11 05:17:19 crc kubenswrapper[4628]: E1211 05:17:19.092244 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h266r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-hc475_openshift-marketplace(fa15b6c3-774f-4d31-8b55-008c3786d329): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 11 05:17:19 crc kubenswrapper[4628]: E1211 05:17:19.093515 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-hc475" podUID="fa15b6c3-774f-4d31-8b55-008c3786d329" Dec 11 05:17:19 crc kubenswrapper[4628]: I1211 05:17:19.681614 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"843a2bbd-6914-4686-a14e-f05f88ddcc07","Type":"ContainerStarted","Data":"b7235605250fb4de4008c27b31113cf26e3f35f9166ef4c21f46118f2bb45295"} Dec 11 05:17:19 crc kubenswrapper[4628]: I1211 05:17:19.683875 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5c556f8d-9095-4617-beb5-a9fdc918a365","Type":"ContainerStarted","Data":"6f3f25ad0e07ca4f1435df730bb04bde185bb6abd12b6c4b7ad375681e115513"} Dec 11 05:17:19 crc kubenswrapper[4628]: E1211 05:17:19.686625 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-hc475" podUID="fa15b6c3-774f-4d31-8b55-008c3786d329" Dec 11 05:17:19 crc kubenswrapper[4628]: I1211 05:17:19.707323 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=40.707299451 podStartE2EDuration="40.707299451s" podCreationTimestamp="2025-12-11 05:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:17:19.700932895 +0000 UTC m=+142.118279633" watchObservedRunningTime="2025-12-11 05:17:19.707299451 +0000 UTC 
m=+142.124646179" Dec 11 05:17:20 crc kubenswrapper[4628]: E1211 05:17:20.231686 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 11 05:17:20 crc kubenswrapper[4628]: E1211 05:17:20.232100 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r2h92,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-mjcbm_openshift-marketplace(2a9eb6ef-92ff-415b-a526-26711b88985f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 11 05:17:20 crc kubenswrapper[4628]: E1211 05:17:20.233356 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-mjcbm" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" Dec 11 05:17:20 crc kubenswrapper[4628]: I1211 05:17:20.695190 4628 generic.go:334] "Generic (PLEG): container finished" podID="5c556f8d-9095-4617-beb5-a9fdc918a365" containerID="6f3f25ad0e07ca4f1435df730bb04bde185bb6abd12b6c4b7ad375681e115513" exitCode=0 Dec 11 05:17:20 crc kubenswrapper[4628]: I1211 05:17:20.695288 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5c556f8d-9095-4617-beb5-a9fdc918a365","Type":"ContainerDied","Data":"6f3f25ad0e07ca4f1435df730bb04bde185bb6abd12b6c4b7ad375681e115513"} Dec 11 05:17:20 crc kubenswrapper[4628]: E1211 05:17:20.698418 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-mjcbm" 
podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" Dec 11 05:17:22 crc kubenswrapper[4628]: I1211 05:17:22.036918 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 05:17:22 crc kubenswrapper[4628]: I1211 05:17:22.119216 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c556f8d-9095-4617-beb5-a9fdc918a365-kube-api-access\") pod \"5c556f8d-9095-4617-beb5-a9fdc918a365\" (UID: \"5c556f8d-9095-4617-beb5-a9fdc918a365\") " Dec 11 05:17:22 crc kubenswrapper[4628]: I1211 05:17:22.119482 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c556f8d-9095-4617-beb5-a9fdc918a365-kubelet-dir\") pod \"5c556f8d-9095-4617-beb5-a9fdc918a365\" (UID: \"5c556f8d-9095-4617-beb5-a9fdc918a365\") " Dec 11 05:17:22 crc kubenswrapper[4628]: I1211 05:17:22.119536 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c556f8d-9095-4617-beb5-a9fdc918a365-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5c556f8d-9095-4617-beb5-a9fdc918a365" (UID: "5c556f8d-9095-4617-beb5-a9fdc918a365"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:17:22 crc kubenswrapper[4628]: I1211 05:17:22.120004 4628 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c556f8d-9095-4617-beb5-a9fdc918a365-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:22 crc kubenswrapper[4628]: I1211 05:17:22.132743 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c556f8d-9095-4617-beb5-a9fdc918a365-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5c556f8d-9095-4617-beb5-a9fdc918a365" (UID: "5c556f8d-9095-4617-beb5-a9fdc918a365"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:17:22 crc kubenswrapper[4628]: I1211 05:17:22.220812 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c556f8d-9095-4617-beb5-a9fdc918a365-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:22 crc kubenswrapper[4628]: I1211 05:17:22.709143 4628 generic.go:334] "Generic (PLEG): container finished" podID="eeedd8fe-1e9a-4009-a385-07d72fed1277" containerID="27e14cf1997491c2be8267060246a6688618f93c19a3b79e88a5c74b708b126a" exitCode=0 Dec 11 05:17:22 crc kubenswrapper[4628]: I1211 05:17:22.709233 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbrk4" event={"ID":"eeedd8fe-1e9a-4009-a385-07d72fed1277","Type":"ContainerDied","Data":"27e14cf1997491c2be8267060246a6688618f93c19a3b79e88a5c74b708b126a"} Dec 11 05:17:22 crc kubenswrapper[4628]: I1211 05:17:22.712207 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5c556f8d-9095-4617-beb5-a9fdc918a365","Type":"ContainerDied","Data":"93a57479ee39e0ba0cb70b0bc3e73cde3d0f601ca78688c47d96a73a7133ce7c"} Dec 11 05:17:22 crc kubenswrapper[4628]: I1211 05:17:22.712279 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93a57479ee39e0ba0cb70b0bc3e73cde3d0f601ca78688c47d96a73a7133ce7c" Dec 11 05:17:22 crc kubenswrapper[4628]: I1211 05:17:22.712283 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 11 05:17:22 crc kubenswrapper[4628]: I1211 05:17:22.716451 4628 generic.go:334] "Generic (PLEG): container finished" podID="445e77bd-611f-486b-af50-16e4476e29e4" containerID="728fed19e4d96090f7d567c888d11fec091d97157d2c1099c013200bd7b964ec" exitCode=0 Dec 11 05:17:22 crc kubenswrapper[4628]: I1211 05:17:22.716562 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjbgr" event={"ID":"445e77bd-611f-486b-af50-16e4476e29e4","Type":"ContainerDied","Data":"728fed19e4d96090f7d567c888d11fec091d97157d2c1099c013200bd7b964ec"} Dec 11 05:17:22 crc kubenswrapper[4628]: I1211 05:17:22.766877 4628 generic.go:334] "Generic (PLEG): container finished" podID="3953bb62-252a-4109-9ed9-f14294565e1d" containerID="2c34092b7acf9cc52c87ec6531e11b349d4a34dda1f1d1211c5bd523e400f08d" exitCode=0 Dec 11 05:17:22 crc kubenswrapper[4628]: I1211 05:17:22.766917 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nrc6p" event={"ID":"3953bb62-252a-4109-9ed9-f14294565e1d","Type":"ContainerDied","Data":"2c34092b7acf9cc52c87ec6531e11b349d4a34dda1f1d1211c5bd523e400f08d"} Dec 11 05:17:23 crc kubenswrapper[4628]: I1211 05:17:23.784354 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjbgr" event={"ID":"445e77bd-611f-486b-af50-16e4476e29e4","Type":"ContainerStarted","Data":"c402e0a0353425ce6902e2c03863aa1f2b70a31113d19f8adf371a76d50d4371"} Dec 11 05:17:23 crc kubenswrapper[4628]: I1211 05:17:23.789538 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nrc6p" event={"ID":"3953bb62-252a-4109-9ed9-f14294565e1d","Type":"ContainerStarted","Data":"af5892c5ffe4f785166ea9d140df5821a2d93277128b0dd82511e4481a7c0a8f"} Dec 11 05:17:23 crc kubenswrapper[4628]: I1211 05:17:23.795352 4628 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-rbrk4" event={"ID":"eeedd8fe-1e9a-4009-a385-07d72fed1277","Type":"ContainerStarted","Data":"a2321030527cd26a6ef15e2ea3e3407f2f48c539b82fa8ce81f3facb6965e2f4"} Dec 11 05:17:23 crc kubenswrapper[4628]: I1211 05:17:23.804718 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hjbgr" podStartSLOduration=2.846011944 podStartE2EDuration="1m27.804699323s" podCreationTimestamp="2025-12-11 05:15:56 +0000 UTC" firstStartedPulling="2025-12-11 05:15:58.305241454 +0000 UTC m=+60.722588152" lastFinishedPulling="2025-12-11 05:17:23.263928793 +0000 UTC m=+145.681275531" observedRunningTime="2025-12-11 05:17:23.802580564 +0000 UTC m=+146.219927262" watchObservedRunningTime="2025-12-11 05:17:23.804699323 +0000 UTC m=+146.222046021" Dec 11 05:17:23 crc kubenswrapper[4628]: I1211 05:17:23.823435 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nrc6p" podStartSLOduration=4.103349541 podStartE2EDuration="1m25.823415952s" podCreationTimestamp="2025-12-11 05:15:58 +0000 UTC" firstStartedPulling="2025-12-11 05:16:01.794012745 +0000 UTC m=+64.211359443" lastFinishedPulling="2025-12-11 05:17:23.514079166 +0000 UTC m=+145.931425854" observedRunningTime="2025-12-11 05:17:23.821079278 +0000 UTC m=+146.238425976" watchObservedRunningTime="2025-12-11 05:17:23.823415952 +0000 UTC m=+146.240762650" Dec 11 05:17:23 crc kubenswrapper[4628]: I1211 05:17:23.841678 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rbrk4" podStartSLOduration=4.159545953 podStartE2EDuration="1m25.841660299s" podCreationTimestamp="2025-12-11 05:15:58 +0000 UTC" firstStartedPulling="2025-12-11 05:16:01.810954876 +0000 UTC m=+64.228301574" lastFinishedPulling="2025-12-11 05:17:23.493069222 +0000 UTC m=+145.910415920" observedRunningTime="2025-12-11 05:17:23.841284349 +0000 UTC m=+146.258631047" watchObservedRunningTime="2025-12-11 05:17:23.841660299 +0000 UTC m=+146.259006997" Dec 11 05:17:26 crc kubenswrapper[4628]: I1211 05:17:26.653384 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" podUID="a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" containerName="oauth-openshift" containerID="cri-o://d8e1961a7e7644e5ce3bf748c250af80bbbe86b73756cba30cfead0c42fee1e1" gracePeriod=15 Dec 11 05:17:26 crc kubenswrapper[4628]: I1211 05:17:26.727499 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hjbgr" Dec 11 05:17:26 crc kubenswrapper[4628]: I1211 05:17:26.728521 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hjbgr" Dec 11 05:17:27 crc kubenswrapper[4628]: I1211 05:17:27.059164 4628 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-wg98b container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" start-of-body= Dec 11 05:17:27 crc kubenswrapper[4628]: I1211 05:17:27.059220 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" podUID="a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": dial 
tcp 10.217.0.15:6443: connect: connection refused" Dec 11 05:17:27 crc kubenswrapper[4628]: I1211 05:17:27.091583 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hjbgr" Dec 11 05:17:27 crc kubenswrapper[4628]: I1211 05:17:27.825767 4628 generic.go:334] "Generic (PLEG): container finished" podID="a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" containerID="d8e1961a7e7644e5ce3bf748c250af80bbbe86b73756cba30cfead0c42fee1e1" exitCode=0 Dec 11 05:17:27 crc kubenswrapper[4628]: I1211 05:17:27.826374 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" event={"ID":"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb","Type":"ContainerDied","Data":"d8e1961a7e7644e5ce3bf748c250af80bbbe86b73756cba30cfead0c42fee1e1"} Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.422374 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.466032 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7d8577449b-kh6m7"] Dec 11 05:17:28 crc kubenswrapper[4628]: E1211 05:17:28.466368 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eefadbf-ac92-4b97-999e-fb262b5d45c2" containerName="kube-multus-additional-cni-plugins" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.466387 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eefadbf-ac92-4b97-999e-fb262b5d45c2" containerName="kube-multus-additional-cni-plugins" Dec 11 05:17:28 crc kubenswrapper[4628]: E1211 05:17:28.466408 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c556f8d-9095-4617-beb5-a9fdc918a365" containerName="pruner" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.466420 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c556f8d-9095-4617-beb5-a9fdc918a365" containerName="pruner" Dec 11 05:17:28 crc kubenswrapper[4628]: E1211 05:17:28.466435 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" containerName="oauth-openshift" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.466449 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" containerName="oauth-openshift" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.466621 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c556f8d-9095-4617-beb5-a9fdc918a365" containerName="pruner" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.466641 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eefadbf-ac92-4b97-999e-fb262b5d45c2" containerName="kube-multus-additional-cni-plugins" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.466660 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" containerName="oauth-openshift" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.467241 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.521274 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-service-ca\") pod \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.521360 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mncrp\" (UniqueName: \"kubernetes.io/projected/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-kube-api-access-mncrp\") pod \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.521396 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-template-error\") pod \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.521454 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-trusted-ca-bundle\") pod \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.521514 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-serving-cert\") pod \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.521603 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-ocp-branding-template\") pod \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.521637 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-cliconfig\") pod \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.521663 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-session\") pod \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.521691 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-router-certs\") pod \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " 
Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.521724 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-audit-dir\") pod \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.521782 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-template-login\") pod \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.521809 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-idp-0-file-data\") pod \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.521841 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-audit-policies\") pod \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.521906 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-template-provider-selection\") pod \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\" (UID: \"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb\") " Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.522009 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" (UID: "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.522289 4628 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.522750 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" (UID: "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.523222 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" (UID: "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.523297 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" (UID: "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.523475 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" (UID: "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.530300 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" (UID: "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.534566 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" (UID: "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.534615 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-kube-api-access-mncrp" (OuterVolumeSpecName: "kube-api-access-mncrp") pod "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" (UID: "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb"). InnerVolumeSpecName "kube-api-access-mncrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.535358 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7d8577449b-kh6m7"] Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.539638 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" (UID: "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.540293 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" (UID: "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb"). 
InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.545539 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" (UID: "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.550582 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" (UID: "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.550923 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" (UID: "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.553126 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" (UID: "a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623227 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-user-template-error\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623288 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67111633-8492-4db8-be4f-962a92896b19-audit-dir\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623330 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xngj4\" (UniqueName: \"kubernetes.io/projected/67111633-8492-4db8-be4f-962a92896b19-kube-api-access-xngj4\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623366 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623400 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623454 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/67111633-8492-4db8-be4f-962a92896b19-audit-policies\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623485 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-service-ca\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623510 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-session\") pod 
\"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623560 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623586 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623609 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623635 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623665 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-router-certs\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623692 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-user-template-login\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623739 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623757 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 
05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623772 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623794 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623909 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623944 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623960 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623975 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.623990 4628 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.624030 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.624051 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.624068 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mncrp\" (UniqueName: \"kubernetes.io/projected/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-kube-api-access-mncrp\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.624080 4628 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.725550 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.725631 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.725709 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.725754 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.726898 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-router-certs\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.726959 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-user-template-login\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.727067 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-user-template-error\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.727372 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.728348 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.728841 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67111633-8492-4db8-be4f-962a92896b19-audit-dir\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.729029 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xngj4\" (UniqueName: \"kubernetes.io/projected/67111633-8492-4db8-be4f-962a92896b19-kube-api-access-xngj4\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.729095 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.729148 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.729158 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67111633-8492-4db8-be4f-962a92896b19-audit-dir\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.729187 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/67111633-8492-4db8-be4f-962a92896b19-audit-policies\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.729243 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-service-ca\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.729272 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-session\") pod 
\"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.730134 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.730375 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/67111633-8492-4db8-be4f-962a92896b19-audit-policies\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.730590 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-service-ca\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.730750 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-user-template-login\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.731491 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-router-certs\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.732139 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.732389 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-user-template-error\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.732617 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-session\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " 
pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.735074 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.735147 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/67111633-8492-4db8-be4f-962a92896b19-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.759001 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xngj4\" (UniqueName: \"kubernetes.io/projected/67111633-8492-4db8-be4f-962a92896b19-kube-api-access-xngj4\") pod \"oauth-openshift-7d8577449b-kh6m7\" (UID: \"67111633-8492-4db8-be4f-962a92896b19\") " pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.791128 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.836357 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7hq5" event={"ID":"b719f066-3997-4c70-bfb1-b489c56e2ef4","Type":"ContainerStarted","Data":"09e3e7295d1cbf35a5c0ae88ec80fded60f7f8e7a6e2f23d44d22c67040cc507"} Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.840438 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" event={"ID":"a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb","Type":"ContainerDied","Data":"33a4cc2d8be09644a4a333c502c4103ffca7e1af9575fd8afe0f1161dab11cac"} Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.840525 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wg98b" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.840562 4628 scope.go:117] "RemoveContainer" containerID="d8e1961a7e7644e5ce3bf748c250af80bbbe86b73756cba30cfead0c42fee1e1" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.851220 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rbrk4" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.851274 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rbrk4" Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.893474 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wg98b"] Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.894944 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wg98b"] Dec 11 05:17:28 crc kubenswrapper[4628]: I1211 05:17:28.930756 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rbrk4" Dec 11 05:17:29 crc kubenswrapper[4628]: I1211 05:17:29.281671 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7d8577449b-kh6m7"] Dec 11 05:17:29 crc kubenswrapper[4628]: W1211 05:17:29.295988 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67111633_8492_4db8_be4f_962a92896b19.slice/crio-68b6fc5f0f09261c48010f431ceeaa00332314c2ca737d4d94b200e54bda6414 WatchSource:0}: Error finding container 68b6fc5f0f09261c48010f431ceeaa00332314c2ca737d4d94b200e54bda6414: Status 404 returned error can't find the container with id 68b6fc5f0f09261c48010f431ceeaa00332314c2ca737d4d94b200e54bda6414 Dec 11 05:17:29 crc kubenswrapper[4628]: I1211 05:17:29.495798 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nrc6p" Dec 11 05:17:29 crc kubenswrapper[4628]: I1211 05:17:29.495973 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nrc6p" Dec 11 05:17:29 crc kubenswrapper[4628]: I1211 05:17:29.568283 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nrc6p" Dec 11 05:17:29 crc kubenswrapper[4628]: I1211 05:17:29.852685 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" event={"ID":"67111633-8492-4db8-be4f-962a92896b19","Type":"ContainerStarted","Data":"68b6fc5f0f09261c48010f431ceeaa00332314c2ca737d4d94b200e54bda6414"} Dec 11 05:17:29 crc kubenswrapper[4628]: I1211 05:17:29.912258 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb" path="/var/lib/kubelet/pods/a41f1eba-16e8-4fb6-a0dd-5e3cd76b7fbb/volumes" Dec 11 05:17:29 crc kubenswrapper[4628]: I1211 05:17:29.942355 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rbrk4" Dec 11 05:17:29 crc kubenswrapper[4628]: I1211 05:17:29.962535 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nrc6p" Dec 11 05:17:31 crc kubenswrapper[4628]: I1211 05:17:31.870387 4628 generic.go:334] "Generic (PLEG): 
container finished" podID="b719f066-3997-4c70-bfb1-b489c56e2ef4" containerID="09e3e7295d1cbf35a5c0ae88ec80fded60f7f8e7a6e2f23d44d22c67040cc507" exitCode=0 Dec 11 05:17:31 crc kubenswrapper[4628]: I1211 05:17:31.870509 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7hq5" event={"ID":"b719f066-3997-4c70-bfb1-b489c56e2ef4","Type":"ContainerDied","Data":"09e3e7295d1cbf35a5c0ae88ec80fded60f7f8e7a6e2f23d44d22c67040cc507"} Dec 11 05:17:32 crc kubenswrapper[4628]: I1211 05:17:32.877811 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" event={"ID":"67111633-8492-4db8-be4f-962a92896b19","Type":"ContainerStarted","Data":"5c33e74e9d4e44de539234f6c652590c6468771608daafc1d651918dbab8e9fd"} Dec 11 05:17:32 crc kubenswrapper[4628]: I1211 05:17:32.878226 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:32 crc kubenswrapper[4628]: I1211 05:17:32.880088 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcx8h" event={"ID":"a27ccd53-7ae0-4c98-9461-b545f841ea79","Type":"ContainerStarted","Data":"450f202379c71d93ad9aaccb996e9dcb4a0258cf83bb420f1fb11ab2a44654be"} Dec 11 05:17:32 crc kubenswrapper[4628]: I1211 05:17:32.900530 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" podStartSLOduration=31.900506546 podStartE2EDuration="31.900506546s" podCreationTimestamp="2025-12-11 05:17:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:17:32.900138574 +0000 UTC m=+155.317485262" watchObservedRunningTime="2025-12-11 05:17:32.900506546 +0000 UTC m=+155.317853254" Dec 11 05:17:32 crc kubenswrapper[4628]: I1211 05:17:32.936643 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nrc6p"] Dec 11 05:17:32 crc kubenswrapper[4628]: I1211 05:17:32.937158 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nrc6p" podUID="3953bb62-252a-4109-9ed9-f14294565e1d" containerName="registry-server" containerID="cri-o://af5892c5ffe4f785166ea9d140df5821a2d93277128b0dd82511e4481a7c0a8f" gracePeriod=2 Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.299806 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nrc6p" Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.410278 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh82m\" (UniqueName: \"kubernetes.io/projected/3953bb62-252a-4109-9ed9-f14294565e1d-kube-api-access-bh82m\") pod \"3953bb62-252a-4109-9ed9-f14294565e1d\" (UID: \"3953bb62-252a-4109-9ed9-f14294565e1d\") " Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.410381 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3953bb62-252a-4109-9ed9-f14294565e1d-utilities\") pod \"3953bb62-252a-4109-9ed9-f14294565e1d\" (UID: \"3953bb62-252a-4109-9ed9-f14294565e1d\") " Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.410447 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3953bb62-252a-4109-9ed9-f14294565e1d-catalog-content\") pod \"3953bb62-252a-4109-9ed9-f14294565e1d\" (UID: \"3953bb62-252a-4109-9ed9-f14294565e1d\") " Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.411036 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3953bb62-252a-4109-9ed9-f14294565e1d-utilities" (OuterVolumeSpecName: "utilities") pod "3953bb62-252a-4109-9ed9-f14294565e1d" (UID: "3953bb62-252a-4109-9ed9-f14294565e1d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.416045 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3953bb62-252a-4109-9ed9-f14294565e1d-kube-api-access-bh82m" (OuterVolumeSpecName: "kube-api-access-bh82m") pod "3953bb62-252a-4109-9ed9-f14294565e1d" (UID: "3953bb62-252a-4109-9ed9-f14294565e1d"). InnerVolumeSpecName "kube-api-access-bh82m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.446766 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3953bb62-252a-4109-9ed9-f14294565e1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3953bb62-252a-4109-9ed9-f14294565e1d" (UID: "3953bb62-252a-4109-9ed9-f14294565e1d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.450826 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7d8577449b-kh6m7" Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.511558 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh82m\" (UniqueName: \"kubernetes.io/projected/3953bb62-252a-4109-9ed9-f14294565e1d-kube-api-access-bh82m\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.511605 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3953bb62-252a-4109-9ed9-f14294565e1d-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.511619 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3953bb62-252a-4109-9ed9-f14294565e1d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.890581 4628 generic.go:334] "Generic (PLEG): container finished" podID="3953bb62-252a-4109-9ed9-f14294565e1d" containerID="af5892c5ffe4f785166ea9d140df5821a2d93277128b0dd82511e4481a7c0a8f" exitCode=0 Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.890745 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nrc6p" Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.892795 4628 generic.go:334] "Generic (PLEG): container finished" podID="fa15b6c3-774f-4d31-8b55-008c3786d329" containerID="edcbc9945c5b08a4077e337f5641e44cdf35c924d57a71af42326afb6b9754df" exitCode=0 Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.898461 4628 generic.go:334] "Generic (PLEG): container finished" podID="a27ccd53-7ae0-4c98-9461-b545f841ea79" containerID="450f202379c71d93ad9aaccb996e9dcb4a0258cf83bb420f1fb11ab2a44654be" exitCode=0 Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.915665 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nrc6p" event={"ID":"3953bb62-252a-4109-9ed9-f14294565e1d","Type":"ContainerDied","Data":"af5892c5ffe4f785166ea9d140df5821a2d93277128b0dd82511e4481a7c0a8f"} Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.915715 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nrc6p" event={"ID":"3953bb62-252a-4109-9ed9-f14294565e1d","Type":"ContainerDied","Data":"485442daadc69375a5d13f5ba7339d029c890896988c5167d70efe4e4efe38f5"} Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.915732 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hc475" event={"ID":"fa15b6c3-774f-4d31-8b55-008c3786d329","Type":"ContainerDied","Data":"edcbc9945c5b08a4077e337f5641e44cdf35c924d57a71af42326afb6b9754df"} Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.915745 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7hq5" event={"ID":"b719f066-3997-4c70-bfb1-b489c56e2ef4","Type":"ContainerStarted","Data":"23330347c5b752aa9a032ac194066e0177bb02003625813f885fdfa4c7f106a8"} Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.915757 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcx8h" 
event={"ID":"a27ccd53-7ae0-4c98-9461-b545f841ea79","Type":"ContainerDied","Data":"450f202379c71d93ad9aaccb996e9dcb4a0258cf83bb420f1fb11ab2a44654be"} Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.916709 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k7hq5" podStartSLOduration=3.081144227 podStartE2EDuration="1m37.916684311s" podCreationTimestamp="2025-12-11 05:15:56 +0000 UTC" firstStartedPulling="2025-12-11 05:15:58.308329446 +0000 UTC m=+60.725676144" lastFinishedPulling="2025-12-11 05:17:33.14386953 +0000 UTC m=+155.561216228" observedRunningTime="2025-12-11 05:17:33.91482852 +0000 UTC m=+156.332175208" watchObservedRunningTime="2025-12-11 05:17:33.916684311 +0000 UTC m=+156.334031019" Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.917198 4628 scope.go:117] "RemoveContainer" containerID="af5892c5ffe4f785166ea9d140df5821a2d93277128b0dd82511e4481a7c0a8f" Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.951026 4628 scope.go:117] "RemoveContainer" containerID="2c34092b7acf9cc52c87ec6531e11b349d4a34dda1f1d1211c5bd523e400f08d" Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.959086 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nrc6p"] Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.963001 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nrc6p"] Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.981376 4628 scope.go:117] "RemoveContainer" containerID="0fd98bd50b282ee42ebc696d1287d539b8df5fb99f2ec1ebfdcb7fc971cfdd6b" Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.994953 4628 scope.go:117] "RemoveContainer" containerID="af5892c5ffe4f785166ea9d140df5821a2d93277128b0dd82511e4481a7c0a8f" Dec 11 05:17:33 crc kubenswrapper[4628]: E1211 05:17:33.996963 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af5892c5ffe4f785166ea9d140df5821a2d93277128b0dd82511e4481a7c0a8f\": container with ID starting with af5892c5ffe4f785166ea9d140df5821a2d93277128b0dd82511e4481a7c0a8f not found: ID does not exist" containerID="af5892c5ffe4f785166ea9d140df5821a2d93277128b0dd82511e4481a7c0a8f" Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.997012 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af5892c5ffe4f785166ea9d140df5821a2d93277128b0dd82511e4481a7c0a8f"} err="failed to get container status \"af5892c5ffe4f785166ea9d140df5821a2d93277128b0dd82511e4481a7c0a8f\": rpc error: code = NotFound desc = could not find container \"af5892c5ffe4f785166ea9d140df5821a2d93277128b0dd82511e4481a7c0a8f\": container with ID starting with af5892c5ffe4f785166ea9d140df5821a2d93277128b0dd82511e4481a7c0a8f not found: ID does not exist" Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.997065 4628 scope.go:117] "RemoveContainer" containerID="2c34092b7acf9cc52c87ec6531e11b349d4a34dda1f1d1211c5bd523e400f08d" Dec 11 05:17:33 crc kubenswrapper[4628]: E1211 05:17:33.997388 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c34092b7acf9cc52c87ec6531e11b349d4a34dda1f1d1211c5bd523e400f08d\": container with ID starting with 2c34092b7acf9cc52c87ec6531e11b349d4a34dda1f1d1211c5bd523e400f08d not found: ID does not exist" containerID="2c34092b7acf9cc52c87ec6531e11b349d4a34dda1f1d1211c5bd523e400f08d" Dec 11 
05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.997422 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c34092b7acf9cc52c87ec6531e11b349d4a34dda1f1d1211c5bd523e400f08d"} err="failed to get container status \"2c34092b7acf9cc52c87ec6531e11b349d4a34dda1f1d1211c5bd523e400f08d\": rpc error: code = NotFound desc = could not find container \"2c34092b7acf9cc52c87ec6531e11b349d4a34dda1f1d1211c5bd523e400f08d\": container with ID starting with 2c34092b7acf9cc52c87ec6531e11b349d4a34dda1f1d1211c5bd523e400f08d not found: ID does not exist" Dec 11 05:17:33 crc kubenswrapper[4628]: I1211 05:17:33.997460 4628 scope.go:117] "RemoveContainer" containerID="0fd98bd50b282ee42ebc696d1287d539b8df5fb99f2ec1ebfdcb7fc971cfdd6b" Dec 11 05:17:34 crc kubenswrapper[4628]: E1211 05:17:34.000171 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fd98bd50b282ee42ebc696d1287d539b8df5fb99f2ec1ebfdcb7fc971cfdd6b\": container with ID starting with 0fd98bd50b282ee42ebc696d1287d539b8df5fb99f2ec1ebfdcb7fc971cfdd6b not found: ID does not exist" containerID="0fd98bd50b282ee42ebc696d1287d539b8df5fb99f2ec1ebfdcb7fc971cfdd6b" Dec 11 05:17:34 crc kubenswrapper[4628]: I1211 05:17:34.000217 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fd98bd50b282ee42ebc696d1287d539b8df5fb99f2ec1ebfdcb7fc971cfdd6b"} err="failed to get container status \"0fd98bd50b282ee42ebc696d1287d539b8df5fb99f2ec1ebfdcb7fc971cfdd6b\": rpc error: code = NotFound desc = could not find container \"0fd98bd50b282ee42ebc696d1287d539b8df5fb99f2ec1ebfdcb7fc971cfdd6b\": container with ID starting with 0fd98bd50b282ee42ebc696d1287d539b8df5fb99f2ec1ebfdcb7fc971cfdd6b not found: ID does not exist" Dec 11 05:17:35 crc kubenswrapper[4628]: I1211 05:17:35.903441 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3953bb62-252a-4109-9ed9-f14294565e1d" path="/var/lib/kubelet/pods/3953bb62-252a-4109-9ed9-f14294565e1d/volumes" Dec 11 05:17:36 crc kubenswrapper[4628]: I1211 05:17:36.787517 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hjbgr" Dec 11 05:17:36 crc kubenswrapper[4628]: I1211 05:17:36.910383 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k7hq5" Dec 11 05:17:36 crc kubenswrapper[4628]: I1211 05:17:36.911790 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k7hq5" Dec 11 05:17:36 crc kubenswrapper[4628]: I1211 05:17:36.974188 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k7hq5" Dec 11 05:17:39 crc kubenswrapper[4628]: I1211 05:17:39.929588 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-rbrk4" podUID="eeedd8fe-1e9a-4009-a385-07d72fed1277" containerName="registry-server" probeResult="failure" output=< Dec 11 05:17:39 crc kubenswrapper[4628]: timeout: failed to connect service ":50051" within 1s Dec 11 05:17:39 crc kubenswrapper[4628]: > Dec 11 05:17:46 crc kubenswrapper[4628]: I1211 05:17:46.953585 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k7hq5" Dec 11 05:17:46 crc kubenswrapper[4628]: I1211 05:17:46.995203 4628 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-k7hq5"] Dec 11 05:17:46 crc kubenswrapper[4628]: I1211 05:17:46.995383 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k7hq5" podUID="b719f066-3997-4c70-bfb1-b489c56e2ef4" containerName="registry-server" containerID="cri-o://23330347c5b752aa9a032ac194066e0177bb02003625813f885fdfa4c7f106a8" gracePeriod=2 Dec 11 05:17:47 crc kubenswrapper[4628]: I1211 05:17:47.994657 4628 generic.go:334] "Generic (PLEG): container finished" podID="b719f066-3997-4c70-bfb1-b489c56e2ef4" containerID="23330347c5b752aa9a032ac194066e0177bb02003625813f885fdfa4c7f106a8" exitCode=0 Dec 11 05:17:47 crc kubenswrapper[4628]: I1211 05:17:47.994742 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7hq5" event={"ID":"b719f066-3997-4c70-bfb1-b489c56e2ef4","Type":"ContainerDied","Data":"23330347c5b752aa9a032ac194066e0177bb02003625813f885fdfa4c7f106a8"} Dec 11 05:17:47 crc kubenswrapper[4628]: I1211 05:17:47.997226 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcx8h" event={"ID":"a27ccd53-7ae0-4c98-9461-b545f841ea79","Type":"ContainerStarted","Data":"8ade6dee0b74cc16615e80d499782062e0cd15d0e279a9e2c159df3ac099dc36"} Dec 11 05:17:47 crc kubenswrapper[4628]: I1211 05:17:47.999950 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hc475" event={"ID":"fa15b6c3-774f-4d31-8b55-008c3786d329","Type":"ContainerStarted","Data":"d0e9f1a26de677d30070b06da3acbc5328041df6c0226279be11261a48a1f1d6"} Dec 11 05:17:48 crc kubenswrapper[4628]: I1211 05:17:48.002603 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7m4g7" event={"ID":"3d5f88ec-256a-4556-b06b-814dfa23c87b","Type":"ContainerStarted","Data":"fdb3e89ebd26eb687fd75a78fede0debf168ad8f4a150ad54db295ae82af548e"} Dec 11 05:17:48 crc kubenswrapper[4628]: I1211 05:17:48.013435 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjcbm" event={"ID":"2a9eb6ef-92ff-415b-a526-26711b88985f","Type":"ContainerStarted","Data":"385e6d17700b0b38ac74626d67f6cdb695c2bfdf6c08b48ab5ffba588eac39bf"} Dec 11 05:17:48 crc kubenswrapper[4628]: I1211 05:17:48.053835 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dcx8h" podStartSLOduration=8.386220686 podStartE2EDuration="1m53.053808306s" podCreationTimestamp="2025-12-11 05:15:55 +0000 UTC" firstStartedPulling="2025-12-11 05:16:00.631123602 +0000 UTC m=+63.048470300" lastFinishedPulling="2025-12-11 05:17:45.298711222 +0000 UTC m=+167.716057920" observedRunningTime="2025-12-11 05:17:48.020370748 +0000 UTC m=+170.437717446" watchObservedRunningTime="2025-12-11 05:17:48.053808306 +0000 UTC m=+170.471155014" Dec 11 05:17:48 crc kubenswrapper[4628]: I1211 05:17:48.308738 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k7hq5" Dec 11 05:17:48 crc kubenswrapper[4628]: I1211 05:17:48.367774 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b719f066-3997-4c70-bfb1-b489c56e2ef4-catalog-content\") pod \"b719f066-3997-4c70-bfb1-b489c56e2ef4\" (UID: \"b719f066-3997-4c70-bfb1-b489c56e2ef4\") " Dec 11 05:17:48 crc kubenswrapper[4628]: I1211 05:17:48.367826 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b719f066-3997-4c70-bfb1-b489c56e2ef4-utilities\") pod \"b719f066-3997-4c70-bfb1-b489c56e2ef4\" (UID: \"b719f066-3997-4c70-bfb1-b489c56e2ef4\") " Dec 11 05:17:48 crc kubenswrapper[4628]: I1211 05:17:48.367892 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hqpf\" (UniqueName: \"kubernetes.io/projected/b719f066-3997-4c70-bfb1-b489c56e2ef4-kube-api-access-8hqpf\") pod \"b719f066-3997-4c70-bfb1-b489c56e2ef4\" (UID: \"b719f066-3997-4c70-bfb1-b489c56e2ef4\") " Dec 11 05:17:48 crc kubenswrapper[4628]: I1211 05:17:48.368562 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b719f066-3997-4c70-bfb1-b489c56e2ef4-utilities" (OuterVolumeSpecName: "utilities") pod "b719f066-3997-4c70-bfb1-b489c56e2ef4" (UID: "b719f066-3997-4c70-bfb1-b489c56e2ef4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:17:48 crc kubenswrapper[4628]: I1211 05:17:48.376789 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b719f066-3997-4c70-bfb1-b489c56e2ef4-kube-api-access-8hqpf" (OuterVolumeSpecName: "kube-api-access-8hqpf") pod "b719f066-3997-4c70-bfb1-b489c56e2ef4" (UID: "b719f066-3997-4c70-bfb1-b489c56e2ef4"). InnerVolumeSpecName "kube-api-access-8hqpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:17:48 crc kubenswrapper[4628]: I1211 05:17:48.438150 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b719f066-3997-4c70-bfb1-b489c56e2ef4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b719f066-3997-4c70-bfb1-b489c56e2ef4" (UID: "b719f066-3997-4c70-bfb1-b489c56e2ef4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:17:48 crc kubenswrapper[4628]: I1211 05:17:48.476215 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b719f066-3997-4c70-bfb1-b489c56e2ef4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:48 crc kubenswrapper[4628]: I1211 05:17:48.476249 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b719f066-3997-4c70-bfb1-b489c56e2ef4-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:48 crc kubenswrapper[4628]: I1211 05:17:48.476262 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hqpf\" (UniqueName: \"kubernetes.io/projected/b719f066-3997-4c70-bfb1-b489c56e2ef4-kube-api-access-8hqpf\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:48 crc kubenswrapper[4628]: I1211 05:17:48.982309 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7m4g7"] Dec 11 05:17:48 crc kubenswrapper[4628]: I1211 05:17:48.995810 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dcx8h"] Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.000380 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hjbgr"] Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.000712 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hjbgr" podUID="445e77bd-611f-486b-af50-16e4476e29e4" containerName="registry-server" containerID="cri-o://c402e0a0353425ce6902e2c03863aa1f2b70a31113d19f8adf371a76d50d4371" gracePeriod=30 Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.017350 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6vrgl"] Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.017538 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" podUID="ce2358a9-7f38-41e7-ba92-b82e8e98b458" containerName="marketplace-operator" containerID="cri-o://0924ac6e1d31b618731ffe5cff5071db9f2e1b0f1e54a28173a4d10a18fca739" gracePeriod=30 Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.032969 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbrk4"] Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.033221 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rbrk4" podUID="eeedd8fe-1e9a-4009-a385-07d72fed1277" containerName="registry-server" containerID="cri-o://a2321030527cd26a6ef15e2ea3e3407f2f48c539b82fa8ce81f3facb6965e2f4" gracePeriod=30 Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.036183 4628 generic.go:334] "Generic (PLEG): container finished" podID="3d5f88ec-256a-4556-b06b-814dfa23c87b" containerID="fdb3e89ebd26eb687fd75a78fede0debf168ad8f4a150ad54db295ae82af548e" exitCode=0 Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.036367 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7m4g7" event={"ID":"3d5f88ec-256a-4556-b06b-814dfa23c87b","Type":"ContainerDied","Data":"fdb3e89ebd26eb687fd75a78fede0debf168ad8f4a150ad54db295ae82af548e"} Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.048980 4628 generic.go:334] "Generic (PLEG): 
container finished" podID="2a9eb6ef-92ff-415b-a526-26711b88985f" containerID="385e6d17700b0b38ac74626d67f6cdb695c2bfdf6c08b48ab5ffba588eac39bf" exitCode=0 Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.049083 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjcbm" event={"ID":"2a9eb6ef-92ff-415b-a526-26711b88985f","Type":"ContainerDied","Data":"385e6d17700b0b38ac74626d67f6cdb695c2bfdf6c08b48ab5ffba588eac39bf"} Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.050823 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hc475"] Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.053720 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8wvwr"] Dec 11 05:17:49 crc kubenswrapper[4628]: E1211 05:17:49.054139 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3953bb62-252a-4109-9ed9-f14294565e1d" containerName="registry-server" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.054239 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="3953bb62-252a-4109-9ed9-f14294565e1d" containerName="registry-server" Dec 11 05:17:49 crc kubenswrapper[4628]: E1211 05:17:49.054326 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b719f066-3997-4c70-bfb1-b489c56e2ef4" containerName="extract-utilities" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.054394 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="b719f066-3997-4c70-bfb1-b489c56e2ef4" containerName="extract-utilities" Dec 11 05:17:49 crc kubenswrapper[4628]: E1211 05:17:49.054462 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3953bb62-252a-4109-9ed9-f14294565e1d" containerName="extract-utilities" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.054520 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="3953bb62-252a-4109-9ed9-f14294565e1d" containerName="extract-utilities" Dec 11 05:17:49 crc kubenswrapper[4628]: E1211 05:17:49.054582 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3953bb62-252a-4109-9ed9-f14294565e1d" containerName="extract-content" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.054632 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="3953bb62-252a-4109-9ed9-f14294565e1d" containerName="extract-content" Dec 11 05:17:49 crc kubenswrapper[4628]: E1211 05:17:49.054685 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b719f066-3997-4c70-bfb1-b489c56e2ef4" containerName="extract-content" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.054797 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="b719f066-3997-4c70-bfb1-b489c56e2ef4" containerName="extract-content" Dec 11 05:17:49 crc kubenswrapper[4628]: E1211 05:17:49.054873 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b719f066-3997-4c70-bfb1-b489c56e2ef4" containerName="registry-server" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.054926 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="b719f066-3997-4c70-bfb1-b489c56e2ef4" containerName="registry-server" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.055066 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="b719f066-3997-4c70-bfb1-b489c56e2ef4" containerName="registry-server" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.055284 4628 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3953bb62-252a-4109-9ed9-f14294565e1d" containerName="registry-server" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.057441 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8wvwr" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.064964 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mjcbm"] Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.073050 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8wvwr"] Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.082279 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dcx8h" podUID="a27ccd53-7ae0-4c98-9461-b545f841ea79" containerName="registry-server" containerID="cri-o://8ade6dee0b74cc16615e80d499782062e0cd15d0e279a9e2c159df3ac099dc36" gracePeriod=30 Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.082609 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k7hq5" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.084599 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7hq5" event={"ID":"b719f066-3997-4c70-bfb1-b489c56e2ef4","Type":"ContainerDied","Data":"87beb35fd22927e3d7f10bdba4d317a63288b0660668dc61d5ce635a9c5adbfe"} Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.084654 4628 scope.go:117] "RemoveContainer" containerID="23330347c5b752aa9a032ac194066e0177bb02003625813f885fdfa4c7f106a8" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.085805 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7683eae0-a7bd-46c4-867e-b15d65fc5e7e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8wvwr\" (UID: \"7683eae0-a7bd-46c4-867e-b15d65fc5e7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-8wvwr" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.085966 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7683eae0-a7bd-46c4-867e-b15d65fc5e7e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8wvwr\" (UID: \"7683eae0-a7bd-46c4-867e-b15d65fc5e7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-8wvwr" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.086091 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67x5l\" (UniqueName: \"kubernetes.io/projected/7683eae0-a7bd-46c4-867e-b15d65fc5e7e-kube-api-access-67x5l\") pod \"marketplace-operator-79b997595-8wvwr\" (UID: \"7683eae0-a7bd-46c4-867e-b15d65fc5e7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-8wvwr" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.152342 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hc475" podStartSLOduration=10.06622472 podStartE2EDuration="1m50.152322877s" podCreationTimestamp="2025-12-11 05:15:59 +0000 UTC" firstStartedPulling="2025-12-11 05:16:01.783926515 +0000 UTC m=+64.201273213" lastFinishedPulling="2025-12-11 05:17:41.870024672 +0000 UTC m=+164.287371370" 
observedRunningTime="2025-12-11 05:17:49.124445993 +0000 UTC m=+171.541792691" watchObservedRunningTime="2025-12-11 05:17:49.152322877 +0000 UTC m=+171.569669585" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.165799 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k7hq5"] Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.170201 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k7hq5"] Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.173665 4628 scope.go:117] "RemoveContainer" containerID="09e3e7295d1cbf35a5c0ae88ec80fded60f7f8e7a6e2f23d44d22c67040cc507" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.188239 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7683eae0-a7bd-46c4-867e-b15d65fc5e7e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8wvwr\" (UID: \"7683eae0-a7bd-46c4-867e-b15d65fc5e7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-8wvwr" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.188273 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7683eae0-a7bd-46c4-867e-b15d65fc5e7e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8wvwr\" (UID: \"7683eae0-a7bd-46c4-867e-b15d65fc5e7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-8wvwr" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.188325 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67x5l\" (UniqueName: \"kubernetes.io/projected/7683eae0-a7bd-46c4-867e-b15d65fc5e7e-kube-api-access-67x5l\") pod \"marketplace-operator-79b997595-8wvwr\" (UID: \"7683eae0-a7bd-46c4-867e-b15d65fc5e7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-8wvwr" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.189750 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7683eae0-a7bd-46c4-867e-b15d65fc5e7e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8wvwr\" (UID: \"7683eae0-a7bd-46c4-867e-b15d65fc5e7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-8wvwr" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.204752 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67x5l\" (UniqueName: \"kubernetes.io/projected/7683eae0-a7bd-46c4-867e-b15d65fc5e7e-kube-api-access-67x5l\") pod \"marketplace-operator-79b997595-8wvwr\" (UID: \"7683eae0-a7bd-46c4-867e-b15d65fc5e7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-8wvwr" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.208459 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7683eae0-a7bd-46c4-867e-b15d65fc5e7e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8wvwr\" (UID: \"7683eae0-a7bd-46c4-867e-b15d65fc5e7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-8wvwr" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.227453 4628 scope.go:117] "RemoveContainer" containerID="506713efa9deab099dc4bb44882b34dc255fd7432a4cc46b0ca2084e2c8c8c15" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.322194 4628 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7m4g7" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.389983 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d5f88ec-256a-4556-b06b-814dfa23c87b-utilities\") pod \"3d5f88ec-256a-4556-b06b-814dfa23c87b\" (UID: \"3d5f88ec-256a-4556-b06b-814dfa23c87b\") " Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.390301 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9swxk\" (UniqueName: \"kubernetes.io/projected/3d5f88ec-256a-4556-b06b-814dfa23c87b-kube-api-access-9swxk\") pod \"3d5f88ec-256a-4556-b06b-814dfa23c87b\" (UID: \"3d5f88ec-256a-4556-b06b-814dfa23c87b\") " Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.390600 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d5f88ec-256a-4556-b06b-814dfa23c87b-utilities" (OuterVolumeSpecName: "utilities") pod "3d5f88ec-256a-4556-b06b-814dfa23c87b" (UID: "3d5f88ec-256a-4556-b06b-814dfa23c87b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.391033 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d5f88ec-256a-4556-b06b-814dfa23c87b-catalog-content\") pod \"3d5f88ec-256a-4556-b06b-814dfa23c87b\" (UID: \"3d5f88ec-256a-4556-b06b-814dfa23c87b\") " Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.391402 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d5f88ec-256a-4556-b06b-814dfa23c87b-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.392724 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d5f88ec-256a-4556-b06b-814dfa23c87b-kube-api-access-9swxk" (OuterVolumeSpecName: "kube-api-access-9swxk") pod "3d5f88ec-256a-4556-b06b-814dfa23c87b" (UID: "3d5f88ec-256a-4556-b06b-814dfa23c87b"). InnerVolumeSpecName "kube-api-access-9swxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.408120 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8wvwr" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.445840 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d5f88ec-256a-4556-b06b-814dfa23c87b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d5f88ec-256a-4556-b06b-814dfa23c87b" (UID: "3d5f88ec-256a-4556-b06b-814dfa23c87b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.493565 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9swxk\" (UniqueName: \"kubernetes.io/projected/3d5f88ec-256a-4556-b06b-814dfa23c87b-kube-api-access-9swxk\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.493592 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d5f88ec-256a-4556-b06b-814dfa23c87b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.629047 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8wvwr"] Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.900158 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b719f066-3997-4c70-bfb1-b489c56e2ef4" path="/var/lib/kubelet/pods/b719f066-3997-4c70-bfb1-b489c56e2ef4/volumes" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.964019 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hc475" Dec 11 05:17:49 crc kubenswrapper[4628]: I1211 05:17:49.964084 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hc475" Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.090171 4628 generic.go:334] "Generic (PLEG): container finished" podID="a27ccd53-7ae0-4c98-9461-b545f841ea79" containerID="8ade6dee0b74cc16615e80d499782062e0cd15d0e279a9e2c159df3ac099dc36" exitCode=0 Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.090261 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcx8h" event={"ID":"a27ccd53-7ae0-4c98-9461-b545f841ea79","Type":"ContainerDied","Data":"8ade6dee0b74cc16615e80d499782062e0cd15d0e279a9e2c159df3ac099dc36"} Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.091948 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7m4g7" event={"ID":"3d5f88ec-256a-4556-b06b-814dfa23c87b","Type":"ContainerDied","Data":"571364e39ca5d07d374fde8e1cef9a3d8f1cf944268631788e85fdc9c34f072b"} Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.091983 4628 scope.go:117] "RemoveContainer" containerID="fdb3e89ebd26eb687fd75a78fede0debf168ad8f4a150ad54db295ae82af548e" Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.092061 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7m4g7" Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.094171 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8wvwr" event={"ID":"7683eae0-a7bd-46c4-867e-b15d65fc5e7e","Type":"ContainerStarted","Data":"cc3139044f31465346a37442dbcee85f10d18c8f7c711e2c6a74a8d192e495cc"} Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.094190 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8wvwr" event={"ID":"7683eae0-a7bd-46c4-867e-b15d65fc5e7e","Type":"ContainerStarted","Data":"f414215a2b8cf49cae7cf4b7d8871d6a522a89e01893eeb06d9e4c498b39e3d8"} Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.094779 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8wvwr" Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.095793 4628 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8wvwr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" start-of-body= Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.095824 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8wvwr" podUID="7683eae0-a7bd-46c4-867e-b15d65fc5e7e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.096603 4628 generic.go:334] "Generic (PLEG): container finished" podID="eeedd8fe-1e9a-4009-a385-07d72fed1277" containerID="a2321030527cd26a6ef15e2ea3e3407f2f48c539b82fa8ce81f3facb6965e2f4" exitCode=0 Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.096641 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbrk4" event={"ID":"eeedd8fe-1e9a-4009-a385-07d72fed1277","Type":"ContainerDied","Data":"a2321030527cd26a6ef15e2ea3e3407f2f48c539b82fa8ce81f3facb6965e2f4"} Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.097964 4628 generic.go:334] "Generic (PLEG): container finished" podID="ce2358a9-7f38-41e7-ba92-b82e8e98b458" containerID="0924ac6e1d31b618731ffe5cff5071db9f2e1b0f1e54a28173a4d10a18fca739" exitCode=0 Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.098001 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" event={"ID":"ce2358a9-7f38-41e7-ba92-b82e8e98b458","Type":"ContainerDied","Data":"0924ac6e1d31b618731ffe5cff5071db9f2e1b0f1e54a28173a4d10a18fca739"} Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.100429 4628 generic.go:334] "Generic (PLEG): container finished" podID="445e77bd-611f-486b-af50-16e4476e29e4" containerID="c402e0a0353425ce6902e2c03863aa1f2b70a31113d19f8adf371a76d50d4371" exitCode=0 Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.100662 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hc475" podUID="fa15b6c3-774f-4d31-8b55-008c3786d329" containerName="registry-server" containerID="cri-o://d0e9f1a26de677d30070b06da3acbc5328041df6c0226279be11261a48a1f1d6" gracePeriod=30 Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.100709 4628 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjbgr" event={"ID":"445e77bd-611f-486b-af50-16e4476e29e4","Type":"ContainerDied","Data":"c402e0a0353425ce6902e2c03863aa1f2b70a31113d19f8adf371a76d50d4371"} Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.118881 4628 scope.go:117] "RemoveContainer" containerID="bf54f156f958230768e0d40884ea3ba73cdb572e1bc60abe93af6ca18feec130" Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.121066 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8wvwr" podStartSLOduration=1.121047746 podStartE2EDuration="1.121047746s" podCreationTimestamp="2025-12-11 05:17:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:17:50.117443236 +0000 UTC m=+172.534789934" watchObservedRunningTime="2025-12-11 05:17:50.121047746 +0000 UTC m=+172.538394454" Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.151991 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7m4g7"] Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.157389 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7m4g7"] Dec 11 05:17:50 crc kubenswrapper[4628]: I1211 05:17:50.217561 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hc475" podUID="fa15b6c3-774f-4d31-8b55-008c3786d329" containerName="registry-server" probeResult="failure" output="" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.129406 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8wvwr" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.337171 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbrk4" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.356403 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dcx8h" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.376294 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hjbgr" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.378493 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.420303 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m597z\" (UniqueName: \"kubernetes.io/projected/a27ccd53-7ae0-4c98-9461-b545f841ea79-kube-api-access-m597z\") pod \"a27ccd53-7ae0-4c98-9461-b545f841ea79\" (UID: \"a27ccd53-7ae0-4c98-9461-b545f841ea79\") " Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.420357 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeedd8fe-1e9a-4009-a385-07d72fed1277-catalog-content\") pod \"eeedd8fe-1e9a-4009-a385-07d72fed1277\" (UID: \"eeedd8fe-1e9a-4009-a385-07d72fed1277\") " Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.420384 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce2358a9-7f38-41e7-ba92-b82e8e98b458-marketplace-operator-metrics\") pod \"ce2358a9-7f38-41e7-ba92-b82e8e98b458\" (UID: \"ce2358a9-7f38-41e7-ba92-b82e8e98b458\") " Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.420403 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a27ccd53-7ae0-4c98-9461-b545f841ea79-catalog-content\") pod \"a27ccd53-7ae0-4c98-9461-b545f841ea79\" (UID: \"a27ccd53-7ae0-4c98-9461-b545f841ea79\") " Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.420427 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/445e77bd-611f-486b-af50-16e4476e29e4-utilities\") pod \"445e77bd-611f-486b-af50-16e4476e29e4\" (UID: \"445e77bd-611f-486b-af50-16e4476e29e4\") " Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.420470 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeedd8fe-1e9a-4009-a385-07d72fed1277-utilities\") pod \"eeedd8fe-1e9a-4009-a385-07d72fed1277\" (UID: \"eeedd8fe-1e9a-4009-a385-07d72fed1277\") " Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.420494 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a27ccd53-7ae0-4c98-9461-b545f841ea79-utilities\") pod \"a27ccd53-7ae0-4c98-9461-b545f841ea79\" (UID: \"a27ccd53-7ae0-4c98-9461-b545f841ea79\") " Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.420509 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/445e77bd-611f-486b-af50-16e4476e29e4-catalog-content\") pod \"445e77bd-611f-486b-af50-16e4476e29e4\" (UID: \"445e77bd-611f-486b-af50-16e4476e29e4\") " Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.420527 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf68p\" (UniqueName: \"kubernetes.io/projected/445e77bd-611f-486b-af50-16e4476e29e4-kube-api-access-cf68p\") pod \"445e77bd-611f-486b-af50-16e4476e29e4\" (UID: \"445e77bd-611f-486b-af50-16e4476e29e4\") " Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.420545 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/ce2358a9-7f38-41e7-ba92-b82e8e98b458-marketplace-trusted-ca\") pod \"ce2358a9-7f38-41e7-ba92-b82e8e98b458\" (UID: \"ce2358a9-7f38-41e7-ba92-b82e8e98b458\") " Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.420594 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgnh4\" (UniqueName: \"kubernetes.io/projected/eeedd8fe-1e9a-4009-a385-07d72fed1277-kube-api-access-cgnh4\") pod \"eeedd8fe-1e9a-4009-a385-07d72fed1277\" (UID: \"eeedd8fe-1e9a-4009-a385-07d72fed1277\") " Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.420633 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nwg6\" (UniqueName: \"kubernetes.io/projected/ce2358a9-7f38-41e7-ba92-b82e8e98b458-kube-api-access-6nwg6\") pod \"ce2358a9-7f38-41e7-ba92-b82e8e98b458\" (UID: \"ce2358a9-7f38-41e7-ba92-b82e8e98b458\") " Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.421559 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeedd8fe-1e9a-4009-a385-07d72fed1277-utilities" (OuterVolumeSpecName: "utilities") pod "eeedd8fe-1e9a-4009-a385-07d72fed1277" (UID: "eeedd8fe-1e9a-4009-a385-07d72fed1277"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.424734 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce2358a9-7f38-41e7-ba92-b82e8e98b458-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "ce2358a9-7f38-41e7-ba92-b82e8e98b458" (UID: "ce2358a9-7f38-41e7-ba92-b82e8e98b458"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.425248 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a27ccd53-7ae0-4c98-9461-b545f841ea79-utilities" (OuterVolumeSpecName: "utilities") pod "a27ccd53-7ae0-4c98-9461-b545f841ea79" (UID: "a27ccd53-7ae0-4c98-9461-b545f841ea79"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.425621 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/445e77bd-611f-486b-af50-16e4476e29e4-utilities" (OuterVolumeSpecName: "utilities") pod "445e77bd-611f-486b-af50-16e4476e29e4" (UID: "445e77bd-611f-486b-af50-16e4476e29e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.434146 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/445e77bd-611f-486b-af50-16e4476e29e4-kube-api-access-cf68p" (OuterVolumeSpecName: "kube-api-access-cf68p") pod "445e77bd-611f-486b-af50-16e4476e29e4" (UID: "445e77bd-611f-486b-af50-16e4476e29e4"). InnerVolumeSpecName "kube-api-access-cf68p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.454324 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a27ccd53-7ae0-4c98-9461-b545f841ea79-kube-api-access-m597z" (OuterVolumeSpecName: "kube-api-access-m597z") pod "a27ccd53-7ae0-4c98-9461-b545f841ea79" (UID: "a27ccd53-7ae0-4c98-9461-b545f841ea79"). InnerVolumeSpecName "kube-api-access-m597z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.458989 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeedd8fe-1e9a-4009-a385-07d72fed1277-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eeedd8fe-1e9a-4009-a385-07d72fed1277" (UID: "eeedd8fe-1e9a-4009-a385-07d72fed1277"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.465336 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeedd8fe-1e9a-4009-a385-07d72fed1277-kube-api-access-cgnh4" (OuterVolumeSpecName: "kube-api-access-cgnh4") pod "eeedd8fe-1e9a-4009-a385-07d72fed1277" (UID: "eeedd8fe-1e9a-4009-a385-07d72fed1277"). InnerVolumeSpecName "kube-api-access-cgnh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.465502 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce2358a9-7f38-41e7-ba92-b82e8e98b458-kube-api-access-6nwg6" (OuterVolumeSpecName: "kube-api-access-6nwg6") pod "ce2358a9-7f38-41e7-ba92-b82e8e98b458" (UID: "ce2358a9-7f38-41e7-ba92-b82e8e98b458"). InnerVolumeSpecName "kube-api-access-6nwg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.465523 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce2358a9-7f38-41e7-ba92-b82e8e98b458-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ce2358a9-7f38-41e7-ba92-b82e8e98b458" (UID: "ce2358a9-7f38-41e7-ba92-b82e8e98b458"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.492624 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/445e77bd-611f-486b-af50-16e4476e29e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "445e77bd-611f-486b-af50-16e4476e29e4" (UID: "445e77bd-611f-486b-af50-16e4476e29e4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.521999 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/445e77bd-611f-486b-af50-16e4476e29e4-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.522047 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeedd8fe-1e9a-4009-a385-07d72fed1277-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.522057 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a27ccd53-7ae0-4c98-9461-b545f841ea79-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.522065 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/445e77bd-611f-486b-af50-16e4476e29e4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.522076 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf68p\" (UniqueName: \"kubernetes.io/projected/445e77bd-611f-486b-af50-16e4476e29e4-kube-api-access-cf68p\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.522085 4628 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce2358a9-7f38-41e7-ba92-b82e8e98b458-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.522094 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgnh4\" (UniqueName: \"kubernetes.io/projected/eeedd8fe-1e9a-4009-a385-07d72fed1277-kube-api-access-cgnh4\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.522117 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nwg6\" (UniqueName: \"kubernetes.io/projected/ce2358a9-7f38-41e7-ba92-b82e8e98b458-kube-api-access-6nwg6\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.522126 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m597z\" (UniqueName: \"kubernetes.io/projected/a27ccd53-7ae0-4c98-9461-b545f841ea79-kube-api-access-m597z\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.522134 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeedd8fe-1e9a-4009-a385-07d72fed1277-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.522149 4628 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ce2358a9-7f38-41e7-ba92-b82e8e98b458-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.523384 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a27ccd53-7ae0-4c98-9461-b545f841ea79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a27ccd53-7ae0-4c98-9461-b545f841ea79" (UID: "a27ccd53-7ae0-4c98-9461-b545f841ea79"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.623521 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a27ccd53-7ae0-4c98-9461-b545f841ea79-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.888137 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hc475_fa15b6c3-774f-4d31-8b55-008c3786d329/registry-server/0.log" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.891265 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hc475" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.896891 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d5f88ec-256a-4556-b06b-814dfa23c87b" path="/var/lib/kubelet/pods/3d5f88ec-256a-4556-b06b-814dfa23c87b/volumes" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.929198 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h266r\" (UniqueName: \"kubernetes.io/projected/fa15b6c3-774f-4d31-8b55-008c3786d329-kube-api-access-h266r\") pod \"fa15b6c3-774f-4d31-8b55-008c3786d329\" (UID: \"fa15b6c3-774f-4d31-8b55-008c3786d329\") " Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.929289 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa15b6c3-774f-4d31-8b55-008c3786d329-catalog-content\") pod \"fa15b6c3-774f-4d31-8b55-008c3786d329\" (UID: \"fa15b6c3-774f-4d31-8b55-008c3786d329\") " Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.929319 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa15b6c3-774f-4d31-8b55-008c3786d329-utilities\") pod \"fa15b6c3-774f-4d31-8b55-008c3786d329\" (UID: \"fa15b6c3-774f-4d31-8b55-008c3786d329\") " Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.930217 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa15b6c3-774f-4d31-8b55-008c3786d329-utilities" (OuterVolumeSpecName: "utilities") pod "fa15b6c3-774f-4d31-8b55-008c3786d329" (UID: "fa15b6c3-774f-4d31-8b55-008c3786d329"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:17:51 crc kubenswrapper[4628]: I1211 05:17:51.937320 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa15b6c3-774f-4d31-8b55-008c3786d329-kube-api-access-h266r" (OuterVolumeSpecName: "kube-api-access-h266r") pod "fa15b6c3-774f-4d31-8b55-008c3786d329" (UID: "fa15b6c3-774f-4d31-8b55-008c3786d329"). InnerVolumeSpecName "kube-api-access-h266r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.031009 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h266r\" (UniqueName: \"kubernetes.io/projected/fa15b6c3-774f-4d31-8b55-008c3786d329-kube-api-access-h266r\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.031041 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa15b6c3-774f-4d31-8b55-008c3786d329-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.040642 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa15b6c3-774f-4d31-8b55-008c3786d329-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa15b6c3-774f-4d31-8b55-008c3786d329" (UID: "fa15b6c3-774f-4d31-8b55-008c3786d329"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.131552 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa15b6c3-774f-4d31-8b55-008c3786d329-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.132895 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcx8h" event={"ID":"a27ccd53-7ae0-4c98-9461-b545f841ea79","Type":"ContainerDied","Data":"822281fc7c659db96eff7124c81359484a44641b3f919d676ff743dee7430497"} Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.132931 4628 scope.go:117] "RemoveContainer" containerID="8ade6dee0b74cc16615e80d499782062e0cd15d0e279a9e2c159df3ac099dc36" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.133004 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dcx8h" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.136445 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hc475_fa15b6c3-774f-4d31-8b55-008c3786d329/registry-server/0.log" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.137109 4628 generic.go:334] "Generic (PLEG): container finished" podID="fa15b6c3-774f-4d31-8b55-008c3786d329" containerID="d0e9f1a26de677d30070b06da3acbc5328041df6c0226279be11261a48a1f1d6" exitCode=1 Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.137166 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hc475" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.137175 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hc475" event={"ID":"fa15b6c3-774f-4d31-8b55-008c3786d329","Type":"ContainerDied","Data":"d0e9f1a26de677d30070b06da3acbc5328041df6c0226279be11261a48a1f1d6"} Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.137367 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hc475" event={"ID":"fa15b6c3-774f-4d31-8b55-008c3786d329","Type":"ContainerDied","Data":"49e497a0cd159195b67846b75c0b9b306a2849e98f92e5e0fc9aa570bd09494c"} Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.140730 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbrk4" event={"ID":"eeedd8fe-1e9a-4009-a385-07d72fed1277","Type":"ContainerDied","Data":"956ed7596c0d0812a671499e30cafa1b2174e9691cde28d8d8f5a251a7596803"} Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.140804 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbrk4" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.143814 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" event={"ID":"ce2358a9-7f38-41e7-ba92-b82e8e98b458","Type":"ContainerDied","Data":"0424bead60a9b2dc8491f36eec8d0cdbce4a923654d43cc0322ba04b83ef11a1"} Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.143873 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6vrgl" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.146750 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjcbm" event={"ID":"2a9eb6ef-92ff-415b-a526-26711b88985f","Type":"ContainerStarted","Data":"df897a4d7cc9cd97c4f88ee500c911ebdb2aed99e8d1f33249c1686c5dc263d7"} Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.146916 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mjcbm" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" containerName="registry-server" containerID="cri-o://df897a4d7cc9cd97c4f88ee500c911ebdb2aed99e8d1f33249c1686c5dc263d7" gracePeriod=30 Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.153249 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hjbgr" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.153399 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjbgr" event={"ID":"445e77bd-611f-486b-af50-16e4476e29e4","Type":"ContainerDied","Data":"3615ecc86be0049d3d99e018ac06758193dedb0ce7d1246c4f8d167a546aa1ba"} Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.161636 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dcx8h"] Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.165835 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dcx8h"] Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.167636 4628 scope.go:117] "RemoveContainer" containerID="450f202379c71d93ad9aaccb996e9dcb4a0258cf83bb420f1fb11ab2a44654be" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.175277 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6vrgl"] Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.189598 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6vrgl"] Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.195179 4628 scope.go:117] "RemoveContainer" containerID="c001a154c7e0c4df6998f947574f71c78e47031ea578290737e99fc4b5c40ece" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.199112 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbrk4"] Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.208574 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbrk4"] Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.213158 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mjcbm" podStartSLOduration=3.841228734 podStartE2EDuration="1m53.213142547s" podCreationTimestamp="2025-12-11 05:15:59 +0000 UTC" firstStartedPulling="2025-12-11 05:16:01.856511891 +0000 UTC m=+64.273858589" lastFinishedPulling="2025-12-11 05:17:51.228425704 +0000 UTC m=+173.645772402" observedRunningTime="2025-12-11 05:17:52.20353926 +0000 UTC m=+174.620885958" watchObservedRunningTime="2025-12-11 05:17:52.213142547 +0000 UTC m=+174.630489245" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.222191 4628 scope.go:117] "RemoveContainer" containerID="d0e9f1a26de677d30070b06da3acbc5328041df6c0226279be11261a48a1f1d6" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.248111 4628 scope.go:117] "RemoveContainer" containerID="edcbc9945c5b08a4077e337f5641e44cdf35c924d57a71af42326afb6b9754df" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.251457 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hc475"] Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.253881 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hc475"] Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.265473 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hjbgr"] Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.268903 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hjbgr"] Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.277555 
4628 scope.go:117] "RemoveContainer" containerID="7548293186cbe858d548c38beeff0a2cec9d088f66ca33206629e264a366f749" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.308341 4628 scope.go:117] "RemoveContainer" containerID="d0e9f1a26de677d30070b06da3acbc5328041df6c0226279be11261a48a1f1d6" Dec 11 05:17:52 crc kubenswrapper[4628]: E1211 05:17:52.308721 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0e9f1a26de677d30070b06da3acbc5328041df6c0226279be11261a48a1f1d6\": container with ID starting with d0e9f1a26de677d30070b06da3acbc5328041df6c0226279be11261a48a1f1d6 not found: ID does not exist" containerID="d0e9f1a26de677d30070b06da3acbc5328041df6c0226279be11261a48a1f1d6" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.308762 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0e9f1a26de677d30070b06da3acbc5328041df6c0226279be11261a48a1f1d6"} err="failed to get container status \"d0e9f1a26de677d30070b06da3acbc5328041df6c0226279be11261a48a1f1d6\": rpc error: code = NotFound desc = could not find container \"d0e9f1a26de677d30070b06da3acbc5328041df6c0226279be11261a48a1f1d6\": container with ID starting with d0e9f1a26de677d30070b06da3acbc5328041df6c0226279be11261a48a1f1d6 not found: ID does not exist" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.308793 4628 scope.go:117] "RemoveContainer" containerID="edcbc9945c5b08a4077e337f5641e44cdf35c924d57a71af42326afb6b9754df" Dec 11 05:17:52 crc kubenswrapper[4628]: E1211 05:17:52.309140 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edcbc9945c5b08a4077e337f5641e44cdf35c924d57a71af42326afb6b9754df\": container with ID starting with edcbc9945c5b08a4077e337f5641e44cdf35c924d57a71af42326afb6b9754df not found: ID does not exist" containerID="edcbc9945c5b08a4077e337f5641e44cdf35c924d57a71af42326afb6b9754df" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.309237 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edcbc9945c5b08a4077e337f5641e44cdf35c924d57a71af42326afb6b9754df"} err="failed to get container status \"edcbc9945c5b08a4077e337f5641e44cdf35c924d57a71af42326afb6b9754df\": rpc error: code = NotFound desc = could not find container \"edcbc9945c5b08a4077e337f5641e44cdf35c924d57a71af42326afb6b9754df\": container with ID starting with edcbc9945c5b08a4077e337f5641e44cdf35c924d57a71af42326afb6b9754df not found: ID does not exist" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.309328 4628 scope.go:117] "RemoveContainer" containerID="7548293186cbe858d548c38beeff0a2cec9d088f66ca33206629e264a366f749" Dec 11 05:17:52 crc kubenswrapper[4628]: E1211 05:17:52.309604 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7548293186cbe858d548c38beeff0a2cec9d088f66ca33206629e264a366f749\": container with ID starting with 7548293186cbe858d548c38beeff0a2cec9d088f66ca33206629e264a366f749 not found: ID does not exist" containerID="7548293186cbe858d548c38beeff0a2cec9d088f66ca33206629e264a366f749" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.309627 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7548293186cbe858d548c38beeff0a2cec9d088f66ca33206629e264a366f749"} err="failed to get container status 
\"7548293186cbe858d548c38beeff0a2cec9d088f66ca33206629e264a366f749\": rpc error: code = NotFound desc = could not find container \"7548293186cbe858d548c38beeff0a2cec9d088f66ca33206629e264a366f749\": container with ID starting with 7548293186cbe858d548c38beeff0a2cec9d088f66ca33206629e264a366f749 not found: ID does not exist" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.309643 4628 scope.go:117] "RemoveContainer" containerID="a2321030527cd26a6ef15e2ea3e3407f2f48c539b82fa8ce81f3facb6965e2f4" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.323737 4628 scope.go:117] "RemoveContainer" containerID="27e14cf1997491c2be8267060246a6688618f93c19a3b79e88a5c74b708b126a" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.336451 4628 scope.go:117] "RemoveContainer" containerID="34d85b8972bda27137ed5e69adf8197995290ad2c08f832f85adc69aa44b690a" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.349958 4628 scope.go:117] "RemoveContainer" containerID="0924ac6e1d31b618731ffe5cff5071db9f2e1b0f1e54a28173a4d10a18fca739" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.362793 4628 scope.go:117] "RemoveContainer" containerID="c402e0a0353425ce6902e2c03863aa1f2b70a31113d19f8adf371a76d50d4371" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.400425 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qszrs"] Dec 11 05:17:52 crc kubenswrapper[4628]: E1211 05:17:52.400711 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeedd8fe-1e9a-4009-a385-07d72fed1277" containerName="extract-content" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.400763 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeedd8fe-1e9a-4009-a385-07d72fed1277" containerName="extract-content" Dec 11 05:17:52 crc kubenswrapper[4628]: E1211 05:17:52.400779 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="445e77bd-611f-486b-af50-16e4476e29e4" containerName="registry-server" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.400785 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="445e77bd-611f-486b-af50-16e4476e29e4" containerName="registry-server" Dec 11 05:17:52 crc kubenswrapper[4628]: E1211 05:17:52.400793 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27ccd53-7ae0-4c98-9461-b545f841ea79" containerName="extract-utilities" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.400861 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27ccd53-7ae0-4c98-9461-b545f841ea79" containerName="extract-utilities" Dec 11 05:17:52 crc kubenswrapper[4628]: E1211 05:17:52.400872 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeedd8fe-1e9a-4009-a385-07d72fed1277" containerName="registry-server" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.400878 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeedd8fe-1e9a-4009-a385-07d72fed1277" containerName="registry-server" Dec 11 05:17:52 crc kubenswrapper[4628]: E1211 05:17:52.400886 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5f88ec-256a-4556-b06b-814dfa23c87b" containerName="extract-content" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.400891 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5f88ec-256a-4556-b06b-814dfa23c87b" containerName="extract-content" Dec 11 05:17:52 crc kubenswrapper[4628]: E1211 05:17:52.400897 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5f88ec-256a-4556-b06b-814dfa23c87b" 
containerName="extract-utilities" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.400903 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5f88ec-256a-4556-b06b-814dfa23c87b" containerName="extract-utilities" Dec 11 05:17:52 crc kubenswrapper[4628]: E1211 05:17:52.400911 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa15b6c3-774f-4d31-8b55-008c3786d329" containerName="extract-utilities" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.400917 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa15b6c3-774f-4d31-8b55-008c3786d329" containerName="extract-utilities" Dec 11 05:17:52 crc kubenswrapper[4628]: E1211 05:17:52.400947 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeedd8fe-1e9a-4009-a385-07d72fed1277" containerName="extract-utilities" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.400953 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeedd8fe-1e9a-4009-a385-07d72fed1277" containerName="extract-utilities" Dec 11 05:17:52 crc kubenswrapper[4628]: E1211 05:17:52.400960 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27ccd53-7ae0-4c98-9461-b545f841ea79" containerName="extract-content" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.400965 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27ccd53-7ae0-4c98-9461-b545f841ea79" containerName="extract-content" Dec 11 05:17:52 crc kubenswrapper[4628]: E1211 05:17:52.400972 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa15b6c3-774f-4d31-8b55-008c3786d329" containerName="registry-server" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.400980 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa15b6c3-774f-4d31-8b55-008c3786d329" containerName="registry-server" Dec 11 05:17:52 crc kubenswrapper[4628]: E1211 05:17:52.400986 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce2358a9-7f38-41e7-ba92-b82e8e98b458" containerName="marketplace-operator" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.400991 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce2358a9-7f38-41e7-ba92-b82e8e98b458" containerName="marketplace-operator" Dec 11 05:17:52 crc kubenswrapper[4628]: E1211 05:17:52.400999 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="445e77bd-611f-486b-af50-16e4476e29e4" containerName="extract-utilities" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.401024 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="445e77bd-611f-486b-af50-16e4476e29e4" containerName="extract-utilities" Dec 11 05:17:52 crc kubenswrapper[4628]: E1211 05:17:52.401032 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27ccd53-7ae0-4c98-9461-b545f841ea79" containerName="registry-server" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.401040 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27ccd53-7ae0-4c98-9461-b545f841ea79" containerName="registry-server" Dec 11 05:17:52 crc kubenswrapper[4628]: E1211 05:17:52.401049 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa15b6c3-774f-4d31-8b55-008c3786d329" containerName="extract-content" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.401056 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa15b6c3-774f-4d31-8b55-008c3786d329" containerName="extract-content" Dec 11 05:17:52 crc kubenswrapper[4628]: E1211 05:17:52.401067 4628 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="445e77bd-611f-486b-af50-16e4476e29e4" containerName="extract-content" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.401072 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="445e77bd-611f-486b-af50-16e4476e29e4" containerName="extract-content" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.401200 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="445e77bd-611f-486b-af50-16e4476e29e4" containerName="registry-server" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.401208 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa15b6c3-774f-4d31-8b55-008c3786d329" containerName="registry-server" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.401215 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="a27ccd53-7ae0-4c98-9461-b545f841ea79" containerName="registry-server" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.401225 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce2358a9-7f38-41e7-ba92-b82e8e98b458" containerName="marketplace-operator" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.401234 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5f88ec-256a-4556-b06b-814dfa23c87b" containerName="extract-content" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.401278 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeedd8fe-1e9a-4009-a385-07d72fed1277" containerName="registry-server" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.402209 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qszrs" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.403828 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qszrs"] Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.440683 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.442151 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc-catalog-content\") pod \"certified-operators-qszrs\" (UID: \"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc\") " pod="openshift-marketplace/certified-operators-qszrs" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.442193 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjcmv\" (UniqueName: \"kubernetes.io/projected/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc-kube-api-access-mjcmv\") pod \"certified-operators-qszrs\" (UID: \"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc\") " pod="openshift-marketplace/certified-operators-qszrs" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.442221 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc-utilities\") pod \"certified-operators-qszrs\" (UID: \"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc\") " pod="openshift-marketplace/certified-operators-qszrs" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.543080 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc-catalog-content\") pod \"certified-operators-qszrs\" (UID: \"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc\") " pod="openshift-marketplace/certified-operators-qszrs" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.543396 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjcmv\" (UniqueName: \"kubernetes.io/projected/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc-kube-api-access-mjcmv\") pod \"certified-operators-qszrs\" (UID: \"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc\") " pod="openshift-marketplace/certified-operators-qszrs" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.543418 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc-utilities\") pod \"certified-operators-qszrs\" (UID: \"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc\") " pod="openshift-marketplace/certified-operators-qszrs" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.543621 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc-catalog-content\") pod \"certified-operators-qszrs\" (UID: \"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc\") " pod="openshift-marketplace/certified-operators-qszrs" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.544087 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc-utilities\") pod \"certified-operators-qszrs\" (UID: \"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc\") " pod="openshift-marketplace/certified-operators-qszrs" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.564730 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjcmv\" (UniqueName: \"kubernetes.io/projected/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc-kube-api-access-mjcmv\") pod \"certified-operators-qszrs\" (UID: \"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc\") " pod="openshift-marketplace/certified-operators-qszrs" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.760386 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qszrs" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.771088 4628 scope.go:117] "RemoveContainer" containerID="728fed19e4d96090f7d567c888d11fec091d97157d2c1099c013200bd7b964ec" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.785405 4628 scope.go:117] "RemoveContainer" containerID="2ea49cc63a78661f88faba3e5633077de8433dd05e4991bb1f381eefebdb8ff5" Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.990373 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8bd7x"] Dec 11 05:17:52 crc kubenswrapper[4628]: I1211 05:17:52.991252 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8bd7x" Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.002150 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.005645 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bd7x"] Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.150182 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0ff0009-81bb-47da-aab8-5caeeec49061-utilities\") pod \"redhat-marketplace-8bd7x\" (UID: \"a0ff0009-81bb-47da-aab8-5caeeec49061\") " pod="openshift-marketplace/redhat-marketplace-8bd7x" Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.150251 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52ff5\" (UniqueName: \"kubernetes.io/projected/a0ff0009-81bb-47da-aab8-5caeeec49061-kube-api-access-52ff5\") pod \"redhat-marketplace-8bd7x\" (UID: \"a0ff0009-81bb-47da-aab8-5caeeec49061\") " pod="openshift-marketplace/redhat-marketplace-8bd7x" Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.150313 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0ff0009-81bb-47da-aab8-5caeeec49061-catalog-content\") pod \"redhat-marketplace-8bd7x\" (UID: \"a0ff0009-81bb-47da-aab8-5caeeec49061\") " pod="openshift-marketplace/redhat-marketplace-8bd7x" Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.195206 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qszrs"] Dec 11 05:17:53 crc kubenswrapper[4628]: W1211 05:17:53.208985 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod596ff0c0_3caf_4bb3_b49a_ff3d4ed25adc.slice/crio-9ca5d856bc9d0ec2bb635083ed90445f1f06d379b104c7ca141937530a3a5be4 WatchSource:0}: Error finding container 9ca5d856bc9d0ec2bb635083ed90445f1f06d379b104c7ca141937530a3a5be4: Status 404 returned error can't find the container with id 9ca5d856bc9d0ec2bb635083ed90445f1f06d379b104c7ca141937530a3a5be4 Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.252014 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0ff0009-81bb-47da-aab8-5caeeec49061-utilities\") pod \"redhat-marketplace-8bd7x\" (UID: \"a0ff0009-81bb-47da-aab8-5caeeec49061\") " pod="openshift-marketplace/redhat-marketplace-8bd7x" Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.252051 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52ff5\" (UniqueName: \"kubernetes.io/projected/a0ff0009-81bb-47da-aab8-5caeeec49061-kube-api-access-52ff5\") pod \"redhat-marketplace-8bd7x\" (UID: \"a0ff0009-81bb-47da-aab8-5caeeec49061\") " pod="openshift-marketplace/redhat-marketplace-8bd7x" Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.252091 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0ff0009-81bb-47da-aab8-5caeeec49061-catalog-content\") pod \"redhat-marketplace-8bd7x\" (UID: \"a0ff0009-81bb-47da-aab8-5caeeec49061\") " 
pod="openshift-marketplace/redhat-marketplace-8bd7x" Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.252477 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0ff0009-81bb-47da-aab8-5caeeec49061-catalog-content\") pod \"redhat-marketplace-8bd7x\" (UID: \"a0ff0009-81bb-47da-aab8-5caeeec49061\") " pod="openshift-marketplace/redhat-marketplace-8bd7x" Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.252718 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0ff0009-81bb-47da-aab8-5caeeec49061-utilities\") pod \"redhat-marketplace-8bd7x\" (UID: \"a0ff0009-81bb-47da-aab8-5caeeec49061\") " pod="openshift-marketplace/redhat-marketplace-8bd7x" Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.272971 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52ff5\" (UniqueName: \"kubernetes.io/projected/a0ff0009-81bb-47da-aab8-5caeeec49061-kube-api-access-52ff5\") pod \"redhat-marketplace-8bd7x\" (UID: \"a0ff0009-81bb-47da-aab8-5caeeec49061\") " pod="openshift-marketplace/redhat-marketplace-8bd7x" Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.313950 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8bd7x" Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.801816 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bd7x"] Dec 11 05:17:53 crc kubenswrapper[4628]: W1211 05:17:53.821070 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0ff0009_81bb_47da_aab8_5caeeec49061.slice/crio-e7c47f5b8d83d5b0aadb8e7f048749d5a4fca38d3b8f539a1ce1ca5d478390e0 WatchSource:0}: Error finding container e7c47f5b8d83d5b0aadb8e7f048749d5a4fca38d3b8f539a1ce1ca5d478390e0: Status 404 returned error can't find the container with id e7c47f5b8d83d5b0aadb8e7f048749d5a4fca38d3b8f539a1ce1ca5d478390e0 Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.904542 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="445e77bd-611f-486b-af50-16e4476e29e4" path="/var/lib/kubelet/pods/445e77bd-611f-486b-af50-16e4476e29e4/volumes" Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.905933 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a27ccd53-7ae0-4c98-9461-b545f841ea79" path="/var/lib/kubelet/pods/a27ccd53-7ae0-4c98-9461-b545f841ea79/volumes" Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.907202 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce2358a9-7f38-41e7-ba92-b82e8e98b458" path="/var/lib/kubelet/pods/ce2358a9-7f38-41e7-ba92-b82e8e98b458/volumes" Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.910724 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeedd8fe-1e9a-4009-a385-07d72fed1277" path="/var/lib/kubelet/pods/eeedd8fe-1e9a-4009-a385-07d72fed1277/volumes" Dec 11 05:17:53 crc kubenswrapper[4628]: I1211 05:17:53.911401 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa15b6c3-774f-4d31-8b55-008c3786d329" path="/var/lib/kubelet/pods/fa15b6c3-774f-4d31-8b55-008c3786d329/volumes" Dec 11 05:17:54 crc kubenswrapper[4628]: I1211 05:17:54.199797 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bd7x" 
event={"ID":"a0ff0009-81bb-47da-aab8-5caeeec49061","Type":"ContainerStarted","Data":"e7c47f5b8d83d5b0aadb8e7f048749d5a4fca38d3b8f539a1ce1ca5d478390e0"} Dec 11 05:17:54 crc kubenswrapper[4628]: I1211 05:17:54.201937 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qszrs" event={"ID":"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc","Type":"ContainerStarted","Data":"9ca5d856bc9d0ec2bb635083ed90445f1f06d379b104c7ca141937530a3a5be4"} Dec 11 05:17:55 crc kubenswrapper[4628]: I1211 05:17:55.398964 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zj2jl"] Dec 11 05:17:55 crc kubenswrapper[4628]: I1211 05:17:55.401318 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zj2jl" Dec 11 05:17:55 crc kubenswrapper[4628]: I1211 05:17:55.406933 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 11 05:17:55 crc kubenswrapper[4628]: I1211 05:17:55.408222 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zj2jl"] Dec 11 05:17:55 crc kubenswrapper[4628]: I1211 05:17:55.582535 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbnrh\" (UniqueName: \"kubernetes.io/projected/1fd0b2ff-4fee-40e6-b0d7-408c706be731-kube-api-access-bbnrh\") pod \"community-operators-zj2jl\" (UID: \"1fd0b2ff-4fee-40e6-b0d7-408c706be731\") " pod="openshift-marketplace/community-operators-zj2jl" Dec 11 05:17:55 crc kubenswrapper[4628]: I1211 05:17:55.582980 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd0b2ff-4fee-40e6-b0d7-408c706be731-utilities\") pod \"community-operators-zj2jl\" (UID: \"1fd0b2ff-4fee-40e6-b0d7-408c706be731\") " pod="openshift-marketplace/community-operators-zj2jl" Dec 11 05:17:55 crc kubenswrapper[4628]: I1211 05:17:55.583026 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd0b2ff-4fee-40e6-b0d7-408c706be731-catalog-content\") pod \"community-operators-zj2jl\" (UID: \"1fd0b2ff-4fee-40e6-b0d7-408c706be731\") " pod="openshift-marketplace/community-operators-zj2jl" Dec 11 05:17:55 crc kubenswrapper[4628]: I1211 05:17:55.683629 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd0b2ff-4fee-40e6-b0d7-408c706be731-catalog-content\") pod \"community-operators-zj2jl\" (UID: \"1fd0b2ff-4fee-40e6-b0d7-408c706be731\") " pod="openshift-marketplace/community-operators-zj2jl" Dec 11 05:17:55 crc kubenswrapper[4628]: I1211 05:17:55.683681 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbnrh\" (UniqueName: \"kubernetes.io/projected/1fd0b2ff-4fee-40e6-b0d7-408c706be731-kube-api-access-bbnrh\") pod \"community-operators-zj2jl\" (UID: \"1fd0b2ff-4fee-40e6-b0d7-408c706be731\") " pod="openshift-marketplace/community-operators-zj2jl" Dec 11 05:17:55 crc kubenswrapper[4628]: I1211 05:17:55.683740 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd0b2ff-4fee-40e6-b0d7-408c706be731-utilities\") pod \"community-operators-zj2jl\" (UID: 
\"1fd0b2ff-4fee-40e6-b0d7-408c706be731\") " pod="openshift-marketplace/community-operators-zj2jl" Dec 11 05:17:55 crc kubenswrapper[4628]: I1211 05:17:55.684226 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd0b2ff-4fee-40e6-b0d7-408c706be731-catalog-content\") pod \"community-operators-zj2jl\" (UID: \"1fd0b2ff-4fee-40e6-b0d7-408c706be731\") " pod="openshift-marketplace/community-operators-zj2jl" Dec 11 05:17:55 crc kubenswrapper[4628]: I1211 05:17:55.684296 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd0b2ff-4fee-40e6-b0d7-408c706be731-utilities\") pod \"community-operators-zj2jl\" (UID: \"1fd0b2ff-4fee-40e6-b0d7-408c706be731\") " pod="openshift-marketplace/community-operators-zj2jl" Dec 11 05:17:55 crc kubenswrapper[4628]: I1211 05:17:55.718358 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbnrh\" (UniqueName: \"kubernetes.io/projected/1fd0b2ff-4fee-40e6-b0d7-408c706be731-kube-api-access-bbnrh\") pod \"community-operators-zj2jl\" (UID: \"1fd0b2ff-4fee-40e6-b0d7-408c706be731\") " pod="openshift-marketplace/community-operators-zj2jl" Dec 11 05:17:55 crc kubenswrapper[4628]: I1211 05:17:55.721220 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zj2jl" Dec 11 05:17:55 crc kubenswrapper[4628]: I1211 05:17:55.948798 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zj2jl"] Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.218485 4628 generic.go:334] "Generic (PLEG): container finished" podID="a0ff0009-81bb-47da-aab8-5caeeec49061" containerID="bd4988d5c137ba6c2b54049a7ea3fa6546ac8463b77ce83a9b7438ce06ca539c" exitCode=0 Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.218593 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bd7x" event={"ID":"a0ff0009-81bb-47da-aab8-5caeeec49061","Type":"ContainerDied","Data":"bd4988d5c137ba6c2b54049a7ea3fa6546ac8463b77ce83a9b7438ce06ca539c"} Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.223269 4628 generic.go:334] "Generic (PLEG): container finished" podID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" containerID="04282ed96b10bb7a59ba3deb58b3349bd7a5907b52535d88c99c73a3f7791e8c" exitCode=0 Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.223778 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qszrs" event={"ID":"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc","Type":"ContainerDied","Data":"04282ed96b10bb7a59ba3deb58b3349bd7a5907b52535d88c99c73a3f7791e8c"} Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.228998 4628 generic.go:334] "Generic (PLEG): container finished" podID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" containerID="80cb6292459b5acb75f4c42a25864d6fc33e898ee203772a368a2c28b9921124" exitCode=0 Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.229059 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zj2jl" event={"ID":"1fd0b2ff-4fee-40e6-b0d7-408c706be731","Type":"ContainerDied","Data":"80cb6292459b5acb75f4c42a25864d6fc33e898ee203772a368a2c28b9921124"} Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.229085 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zj2jl" 
event={"ID":"1fd0b2ff-4fee-40e6-b0d7-408c706be731","Type":"ContainerStarted","Data":"e36ff131183baaf6b37ddea4b497f76b59aa24730cd2ad35f6a4ba0790e90594"} Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.232672 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mjcbm_2a9eb6ef-92ff-415b-a526-26711b88985f/registry-server/0.log" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.234853 4628 generic.go:334] "Generic (PLEG): container finished" podID="2a9eb6ef-92ff-415b-a526-26711b88985f" containerID="df897a4d7cc9cd97c4f88ee500c911ebdb2aed99e8d1f33249c1686c5dc263d7" exitCode=1 Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.234900 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjcbm" event={"ID":"2a9eb6ef-92ff-415b-a526-26711b88985f","Type":"ContainerDied","Data":"df897a4d7cc9cd97c4f88ee500c911ebdb2aed99e8d1f33249c1686c5dc263d7"} Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.605368 4628 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.607102 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://62618b6f7436c23be40f65807a4b596cc5239cbc0a3bcb56392a432931cee1e0" gracePeriod=15 Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.607139 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://32d87588e039f5452312557720f5985a726a905bb51912c2c7b35ecee3858453" gracePeriod=15 Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.607158 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://b32fc5b6abbb405e45900e4faa4990cca046cd21b5f284b0e6903388ec44fbd6" gracePeriod=15 Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.607173 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://f6be3ef18bdf9e850ba13649f4bd7aa9fe150f3791ed3e7d8ccd5d8439fbd76a" gracePeriod=15 Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.607222 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://3d741a2cbe15031dd2689b0f56a89a4671027c8d4520f89d26955ed5f83ac913" gracePeriod=15 Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.612190 4628 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 11 05:17:56 crc kubenswrapper[4628]: E1211 05:17:56.612544 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.612571 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Dec 11 05:17:56 crc kubenswrapper[4628]: E1211 05:17:56.612596 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.612608 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 11 05:17:56 crc kubenswrapper[4628]: E1211 05:17:56.612627 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.612641 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 11 05:17:56 crc kubenswrapper[4628]: E1211 05:17:56.612654 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.612667 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 11 05:17:56 crc kubenswrapper[4628]: E1211 05:17:56.613337 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.613800 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 11 05:17:56 crc kubenswrapper[4628]: E1211 05:17:56.613825 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.613839 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.614137 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.614164 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.614183 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.614198 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.614223 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.614241 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 11 05:17:56 crc kubenswrapper[4628]: E1211 05:17:56.614404 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.614420 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.633374 4628 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.638383 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.644964 4628 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:56 crc kubenswrapper[4628]: E1211 05:17:56.680325 4628 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.18:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.696305 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.696572 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.696673 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.696754 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.696924 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.697004 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.697083 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.697177 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.797881 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.797949 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.797993 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.797996 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.798054 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.798056 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.798078 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.798145 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.798198 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.798222 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.798275 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.798350 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.798388 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.798392 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.798382 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.798416 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:56 crc kubenswrapper[4628]: E1211 05:17:56.824774 4628 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.18:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-8bd7x.1880117badd3dc79 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-8bd7x,UID:a0ff0009-81bb-47da-aab8-5caeeec49061,APIVersion:v1,ResourceVersion:29428,FieldPath:spec.initContainers{extract-content},},Reason:Pulled,Message:Successfully pulled image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\" in 602ms (602ms including waiting). Image size: 1154573130 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-11 05:17:56.824226937 +0000 UTC m=+179.241573645,LastTimestamp:2025-12-11 05:17:56.824226937 +0000 UTC m=+179.241573645,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.949017 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mjcbm_2a9eb6ef-92ff-415b-a526-26711b88985f/registry-server/0.log" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.949657 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mjcbm" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.950108 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.950516 4628 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:56 crc kubenswrapper[4628]: I1211 05:17:56.981372 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.005057 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a9eb6ef-92ff-415b-a526-26711b88985f-catalog-content\") pod \"2a9eb6ef-92ff-415b-a526-26711b88985f\" (UID: \"2a9eb6ef-92ff-415b-a526-26711b88985f\") " Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.005165 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a9eb6ef-92ff-415b-a526-26711b88985f-utilities\") pod \"2a9eb6ef-92ff-415b-a526-26711b88985f\" (UID: \"2a9eb6ef-92ff-415b-a526-26711b88985f\") " Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.006914 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2h92\" (UniqueName: \"kubernetes.io/projected/2a9eb6ef-92ff-415b-a526-26711b88985f-kube-api-access-r2h92\") pod \"2a9eb6ef-92ff-415b-a526-26711b88985f\" (UID: \"2a9eb6ef-92ff-415b-a526-26711b88985f\") " Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.007602 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a9eb6ef-92ff-415b-a526-26711b88985f-utilities" (OuterVolumeSpecName: "utilities") pod "2a9eb6ef-92ff-415b-a526-26711b88985f" (UID: "2a9eb6ef-92ff-415b-a526-26711b88985f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.012288 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a9eb6ef-92ff-415b-a526-26711b88985f-kube-api-access-r2h92" (OuterVolumeSpecName: "kube-api-access-r2h92") pod "2a9eb6ef-92ff-415b-a526-26711b88985f" (UID: "2a9eb6ef-92ff-415b-a526-26711b88985f"). InnerVolumeSpecName "kube-api-access-r2h92". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.108415 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a9eb6ef-92ff-415b-a526-26711b88985f-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.108451 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2h92\" (UniqueName: \"kubernetes.io/projected/2a9eb6ef-92ff-415b-a526-26711b88985f-kube-api-access-r2h92\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.137333 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a9eb6ef-92ff-415b-a526-26711b88985f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a9eb6ef-92ff-415b-a526-26711b88985f" (UID: "2a9eb6ef-92ff-415b-a526-26711b88985f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.208795 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a9eb6ef-92ff-415b-a526-26711b88985f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.242089 4628 generic.go:334] "Generic (PLEG): container finished" podID="a0ff0009-81bb-47da-aab8-5caeeec49061" containerID="9f3e08583c9181503461100d5b5412acead5736313cdaf9cde059ca5c76ae4fc" exitCode=0 Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.242769 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bd7x" event={"ID":"a0ff0009-81bb-47da-aab8-5caeeec49061","Type":"ContainerDied","Data":"9f3e08583c9181503461100d5b5412acead5736313cdaf9cde059ca5c76ae4fc"} Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.243676 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.243872 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.244029 4628 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.245400 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zj2jl" event={"ID":"1fd0b2ff-4fee-40e6-b0d7-408c706be731","Type":"ContainerStarted","Data":"d23c426965c0b40df54a03ff376fd87ece8f1df2b1fbbec25b6efa8201250c79"} Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.246053 4628 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.246429 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.246623 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: 
connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.246818 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.249299 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.250546 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.251161 4628 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3d741a2cbe15031dd2689b0f56a89a4671027c8d4520f89d26955ed5f83ac913" exitCode=0 Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.251191 4628 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="32d87588e039f5452312557720f5985a726a905bb51912c2c7b35ecee3858453" exitCode=0 Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.251198 4628 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b32fc5b6abbb405e45900e4faa4990cca046cd21b5f284b0e6903388ec44fbd6" exitCode=0 Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.251208 4628 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f6be3ef18bdf9e850ba13649f4bd7aa9fe150f3791ed3e7d8ccd5d8439fbd76a" exitCode=2 Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.251276 4628 scope.go:117] "RemoveContainer" containerID="86eafba1edb23013c7f70c5182bc61fd6af5e475a6b40b143dbf567b504b8bd1" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.258430 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mjcbm_2a9eb6ef-92ff-415b-a526-26711b88985f/registry-server/0.log" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.259318 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mjcbm" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.260955 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjcbm" event={"ID":"2a9eb6ef-92ff-415b-a526-26711b88985f","Type":"ContainerDied","Data":"e021ca4df42fd3927f44c33b397a758b31acd76861c9c363a38a0fa3253e19e5"} Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.261582 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.261880 4628 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.262199 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.262506 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.262546 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"865b940fbed0638bef0c642eca19263ab54b01390226d437483c0cb2f8fa421c"} Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.263940 4628 generic.go:334] "Generic (PLEG): container finished" podID="843a2bbd-6914-4686-a14e-f05f88ddcc07" containerID="b7235605250fb4de4008c27b31113cf26e3f35f9166ef4c21f46118f2bb45295" exitCode=0 Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.263989 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"843a2bbd-6914-4686-a14e-f05f88ddcc07","Type":"ContainerDied","Data":"b7235605250fb4de4008c27b31113cf26e3f35f9166ef4c21f46118f2bb45295"} Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.264467 4628 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.264707 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.265039 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.265588 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.266986 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.302880 4628 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.303196 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.303406 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.303664 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.303878 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.321620 4628 scope.go:117] "RemoveContainer" containerID="df897a4d7cc9cd97c4f88ee500c911ebdb2aed99e8d1f33249c1686c5dc263d7" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.338582 4628 scope.go:117] 
"RemoveContainer" containerID="385e6d17700b0b38ac74626d67f6cdb695c2bfdf6c08b48ab5ffba588eac39bf" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.352911 4628 scope.go:117] "RemoveContainer" containerID="5238591181beb8227757f3be992574259e6f93106e01c5f6c313116e19ae4c76" Dec 11 05:17:57 crc kubenswrapper[4628]: E1211 05:17:57.390055 4628 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: E1211 05:17:57.390885 4628 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: E1211 05:17:57.391196 4628 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: E1211 05:17:57.391526 4628 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: E1211 05:17:57.391748 4628 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.391794 4628 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 11 05:17:57 crc kubenswrapper[4628]: E1211 05:17:57.392044 4628 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="200ms" Dec 11 05:17:57 crc kubenswrapper[4628]: E1211 05:17:57.592825 4628 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="400ms" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.896543 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.899508 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.902702 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" 
pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.903165 4628 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: I1211 05:17:57.903469 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:57 crc kubenswrapper[4628]: E1211 05:17:57.993946 4628 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="800ms" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.285872 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.289585 4628 generic.go:334] "Generic (PLEG): container finished" podID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" containerID="d23c426965c0b40df54a03ff376fd87ece8f1df2b1fbbec25b6efa8201250c79" exitCode=0 Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.289663 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zj2jl" event={"ID":"1fd0b2ff-4fee-40e6-b0d7-408c706be731","Type":"ContainerDied","Data":"d23c426965c0b40df54a03ff376fd87ece8f1df2b1fbbec25b6efa8201250c79"} Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.290601 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.291053 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.291363 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.291624 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.292207 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"df192c79190a0951e9429a8103abff2c2c5618e60acf5d38722c091c2a1574a6"} Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.293153 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: E1211 05:17:58.293377 4628 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.18:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.293784 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.294054 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.294375 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.296679 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bd7x" event={"ID":"a0ff0009-81bb-47da-aab8-5caeeec49061","Type":"ContainerStarted","Data":"55a03cc66cfbf3e9cd834bce381975f5ce52a00bcf85b7e47d3f6d0df1a62e07"} Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.297655 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.298252 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc 
kubenswrapper[4628]: I1211 05:17:58.298558 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.298829 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.299102 4628 generic.go:334] "Generic (PLEG): container finished" podID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" containerID="7dbec7998d2d42bfaebd1295042a1dd4837b2bf2c0653331e24d6b18480698f6" exitCode=0 Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.299735 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qszrs" event={"ID":"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc","Type":"ContainerDied","Data":"7dbec7998d2d42bfaebd1295042a1dd4837b2bf2c0653331e24d6b18480698f6"} Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.300128 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.300337 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.300525 4628 status_manager.go:851] "Failed to get status for pod" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" pod="openshift-marketplace/certified-operators-qszrs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qszrs\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.300991 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.301291 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.606016 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.606580 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.607059 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.607312 4628 status_manager.go:851] "Failed to get status for pod" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" pod="openshift-marketplace/certified-operators-qszrs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qszrs\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.607527 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.607776 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.722199 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/843a2bbd-6914-4686-a14e-f05f88ddcc07-var-lock\") pod \"843a2bbd-6914-4686-a14e-f05f88ddcc07\" (UID: \"843a2bbd-6914-4686-a14e-f05f88ddcc07\") " Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.722544 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/843a2bbd-6914-4686-a14e-f05f88ddcc07-kubelet-dir\") pod \"843a2bbd-6914-4686-a14e-f05f88ddcc07\" (UID: \"843a2bbd-6914-4686-a14e-f05f88ddcc07\") " Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.722571 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/843a2bbd-6914-4686-a14e-f05f88ddcc07-kube-api-access\") pod \"843a2bbd-6914-4686-a14e-f05f88ddcc07\" (UID: \"843a2bbd-6914-4686-a14e-f05f88ddcc07\") " Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.722411 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/843a2bbd-6914-4686-a14e-f05f88ddcc07-var-lock" (OuterVolumeSpecName: "var-lock") pod "843a2bbd-6914-4686-a14e-f05f88ddcc07" (UID: "843a2bbd-6914-4686-a14e-f05f88ddcc07"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.722624 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/843a2bbd-6914-4686-a14e-f05f88ddcc07-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "843a2bbd-6914-4686-a14e-f05f88ddcc07" (UID: "843a2bbd-6914-4686-a14e-f05f88ddcc07"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.722776 4628 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/843a2bbd-6914-4686-a14e-f05f88ddcc07-var-lock\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.722791 4628 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/843a2bbd-6914-4686-a14e-f05f88ddcc07-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.741059 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/843a2bbd-6914-4686-a14e-f05f88ddcc07-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "843a2bbd-6914-4686-a14e-f05f88ddcc07" (UID: "843a2bbd-6914-4686-a14e-f05f88ddcc07"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:17:58 crc kubenswrapper[4628]: E1211 05:17:58.794748 4628 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="1.6s" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.823562 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/843a2bbd-6914-4686-a14e-f05f88ddcc07-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.961477 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.962160 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.962667 4628 status_manager.go:851] "Failed to get status for pod" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" pod="openshift-marketplace/certified-operators-qszrs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qszrs\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.962863 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.963120 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.963513 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.963670 4628 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:58 crc kubenswrapper[4628]: I1211 05:17:58.963825 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.127330 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.127476 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.127536 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.127569 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.127620 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.127713 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.127821 4628 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.127864 4628 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.127877 4628 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.306893 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qszrs" event={"ID":"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc","Type":"ContainerStarted","Data":"1662dde031ea99d90bae900963bcc4006659b1f78379200d09f20b9cb03b9a58"} Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.307423 4628 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.307876 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.308170 4628 status_manager.go:851] "Failed to get status for pod" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" pod="openshift-marketplace/certified-operators-qszrs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qszrs\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.308431 4628 status_manager.go:851] "Failed to get status for pod" 
podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.308695 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.308935 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.310061 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zj2jl" event={"ID":"1fd0b2ff-4fee-40e6-b0d7-408c706be731","Type":"ContainerStarted","Data":"01be7683d4796464a297fc0b13b34e4a1cb1918de28170172c832c6e963f1446"} Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.310801 4628 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.311047 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.311271 4628 status_manager.go:851] "Failed to get status for pod" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" pod="openshift-marketplace/certified-operators-qszrs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qszrs\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.311500 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.311749 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.312027 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.313836 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.314434 4628 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="62618b6f7436c23be40f65807a4b596cc5239cbc0a3bcb56392a432931cee1e0" exitCode=0 Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.314498 4628 scope.go:117] "RemoveContainer" containerID="3d741a2cbe15031dd2689b0f56a89a4671027c8d4520f89d26955ed5f83ac913" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.314657 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.320640 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"843a2bbd-6914-4686-a14e-f05f88ddcc07","Type":"ContainerDied","Data":"4361ca9db65c08e41482440c49ed418b60376c136e650eaef275916633e53e66"} Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.320674 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4361ca9db65c08e41482440c49ed418b60376c136e650eaef275916633e53e66" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.320805 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 11 05:17:59 crc kubenswrapper[4628]: E1211 05:17:59.321027 4628 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.18:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.335302 4628 scope.go:117] "RemoveContainer" containerID="32d87588e039f5452312557720f5985a726a905bb51912c2c7b35ecee3858453" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.341533 4628 status_manager.go:851] "Failed to get status for pod" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" pod="openshift-marketplace/certified-operators-qszrs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qszrs\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.342017 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.342236 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.342418 4628 status_manager.go:851] "Failed to 
get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.342901 4628 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.343064 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.346252 4628 status_manager.go:851] "Failed to get status for pod" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" pod="openshift-marketplace/certified-operators-qszrs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qszrs\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.346398 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.346551 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.346686 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.346822 4628 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.347597 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.354216 4628 scope.go:117] "RemoveContainer" 
containerID="b32fc5b6abbb405e45900e4faa4990cca046cd21b5f284b0e6903388ec44fbd6" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.367867 4628 scope.go:117] "RemoveContainer" containerID="f6be3ef18bdf9e850ba13649f4bd7aa9fe150f3791ed3e7d8ccd5d8439fbd76a" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.384978 4628 scope.go:117] "RemoveContainer" containerID="62618b6f7436c23be40f65807a4b596cc5239cbc0a3bcb56392a432931cee1e0" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.407790 4628 scope.go:117] "RemoveContainer" containerID="59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.427006 4628 scope.go:117] "RemoveContainer" containerID="3d741a2cbe15031dd2689b0f56a89a4671027c8d4520f89d26955ed5f83ac913" Dec 11 05:17:59 crc kubenswrapper[4628]: E1211 05:17:59.427469 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d741a2cbe15031dd2689b0f56a89a4671027c8d4520f89d26955ed5f83ac913\": container with ID starting with 3d741a2cbe15031dd2689b0f56a89a4671027c8d4520f89d26955ed5f83ac913 not found: ID does not exist" containerID="3d741a2cbe15031dd2689b0f56a89a4671027c8d4520f89d26955ed5f83ac913" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.427511 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d741a2cbe15031dd2689b0f56a89a4671027c8d4520f89d26955ed5f83ac913"} err="failed to get container status \"3d741a2cbe15031dd2689b0f56a89a4671027c8d4520f89d26955ed5f83ac913\": rpc error: code = NotFound desc = could not find container \"3d741a2cbe15031dd2689b0f56a89a4671027c8d4520f89d26955ed5f83ac913\": container with ID starting with 3d741a2cbe15031dd2689b0f56a89a4671027c8d4520f89d26955ed5f83ac913 not found: ID does not exist" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.427538 4628 scope.go:117] "RemoveContainer" containerID="32d87588e039f5452312557720f5985a726a905bb51912c2c7b35ecee3858453" Dec 11 05:17:59 crc kubenswrapper[4628]: E1211 05:17:59.427869 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32d87588e039f5452312557720f5985a726a905bb51912c2c7b35ecee3858453\": container with ID starting with 32d87588e039f5452312557720f5985a726a905bb51912c2c7b35ecee3858453 not found: ID does not exist" containerID="32d87588e039f5452312557720f5985a726a905bb51912c2c7b35ecee3858453" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.427914 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d87588e039f5452312557720f5985a726a905bb51912c2c7b35ecee3858453"} err="failed to get container status \"32d87588e039f5452312557720f5985a726a905bb51912c2c7b35ecee3858453\": rpc error: code = NotFound desc = could not find container \"32d87588e039f5452312557720f5985a726a905bb51912c2c7b35ecee3858453\": container with ID starting with 32d87588e039f5452312557720f5985a726a905bb51912c2c7b35ecee3858453 not found: ID does not exist" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.427942 4628 scope.go:117] "RemoveContainer" containerID="b32fc5b6abbb405e45900e4faa4990cca046cd21b5f284b0e6903388ec44fbd6" Dec 11 05:17:59 crc kubenswrapper[4628]: E1211 05:17:59.428778 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b32fc5b6abbb405e45900e4faa4990cca046cd21b5f284b0e6903388ec44fbd6\": container with ID starting 
with b32fc5b6abbb405e45900e4faa4990cca046cd21b5f284b0e6903388ec44fbd6 not found: ID does not exist" containerID="b32fc5b6abbb405e45900e4faa4990cca046cd21b5f284b0e6903388ec44fbd6" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.428806 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b32fc5b6abbb405e45900e4faa4990cca046cd21b5f284b0e6903388ec44fbd6"} err="failed to get container status \"b32fc5b6abbb405e45900e4faa4990cca046cd21b5f284b0e6903388ec44fbd6\": rpc error: code = NotFound desc = could not find container \"b32fc5b6abbb405e45900e4faa4990cca046cd21b5f284b0e6903388ec44fbd6\": container with ID starting with b32fc5b6abbb405e45900e4faa4990cca046cd21b5f284b0e6903388ec44fbd6 not found: ID does not exist" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.428824 4628 scope.go:117] "RemoveContainer" containerID="f6be3ef18bdf9e850ba13649f4bd7aa9fe150f3791ed3e7d8ccd5d8439fbd76a" Dec 11 05:17:59 crc kubenswrapper[4628]: E1211 05:17:59.429410 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6be3ef18bdf9e850ba13649f4bd7aa9fe150f3791ed3e7d8ccd5d8439fbd76a\": container with ID starting with f6be3ef18bdf9e850ba13649f4bd7aa9fe150f3791ed3e7d8ccd5d8439fbd76a not found: ID does not exist" containerID="f6be3ef18bdf9e850ba13649f4bd7aa9fe150f3791ed3e7d8ccd5d8439fbd76a" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.429432 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6be3ef18bdf9e850ba13649f4bd7aa9fe150f3791ed3e7d8ccd5d8439fbd76a"} err="failed to get container status \"f6be3ef18bdf9e850ba13649f4bd7aa9fe150f3791ed3e7d8ccd5d8439fbd76a\": rpc error: code = NotFound desc = could not find container \"f6be3ef18bdf9e850ba13649f4bd7aa9fe150f3791ed3e7d8ccd5d8439fbd76a\": container with ID starting with f6be3ef18bdf9e850ba13649f4bd7aa9fe150f3791ed3e7d8ccd5d8439fbd76a not found: ID does not exist" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.429447 4628 scope.go:117] "RemoveContainer" containerID="62618b6f7436c23be40f65807a4b596cc5239cbc0a3bcb56392a432931cee1e0" Dec 11 05:17:59 crc kubenswrapper[4628]: E1211 05:17:59.429711 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62618b6f7436c23be40f65807a4b596cc5239cbc0a3bcb56392a432931cee1e0\": container with ID starting with 62618b6f7436c23be40f65807a4b596cc5239cbc0a3bcb56392a432931cee1e0 not found: ID does not exist" containerID="62618b6f7436c23be40f65807a4b596cc5239cbc0a3bcb56392a432931cee1e0" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.429743 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62618b6f7436c23be40f65807a4b596cc5239cbc0a3bcb56392a432931cee1e0"} err="failed to get container status \"62618b6f7436c23be40f65807a4b596cc5239cbc0a3bcb56392a432931cee1e0\": rpc error: code = NotFound desc = could not find container \"62618b6f7436c23be40f65807a4b596cc5239cbc0a3bcb56392a432931cee1e0\": container with ID starting with 62618b6f7436c23be40f65807a4b596cc5239cbc0a3bcb56392a432931cee1e0 not found: ID does not exist" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.429763 4628 scope.go:117] "RemoveContainer" containerID="59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b" Dec 11 05:17:59 crc kubenswrapper[4628]: E1211 05:17:59.430567 4628 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b\": container with ID starting with 59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b not found: ID does not exist" containerID="59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.430589 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b"} err="failed to get container status \"59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b\": rpc error: code = NotFound desc = could not find container \"59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b\": container with ID starting with 59f7989976880036a55abd671090ee4fee824e3bcb358db073784780a1c01c8b not found: ID does not exist" Dec 11 05:17:59 crc kubenswrapper[4628]: I1211 05:17:59.896718 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 11 05:18:00 crc kubenswrapper[4628]: E1211 05:18:00.395957 4628 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="3.2s" Dec 11 05:18:01 crc kubenswrapper[4628]: E1211 05:18:01.147100 4628 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.18:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-8bd7x.1880117badd3dc79 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-8bd7x,UID:a0ff0009-81bb-47da-aab8-5caeeec49061,APIVersion:v1,ResourceVersion:29428,FieldPath:spec.initContainers{extract-content},},Reason:Pulled,Message:Successfully pulled image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\" in 602ms (602ms including waiting). 
Image size: 1154573130 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-11 05:17:56.824226937 +0000 UTC m=+179.241573645,LastTimestamp:2025-12-11 05:17:56.824226937 +0000 UTC m=+179.241573645,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 11 05:18:01 crc kubenswrapper[4628]: I1211 05:18:01.426556 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:18:01 crc kubenswrapper[4628]: I1211 05:18:01.426651 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:18:02 crc kubenswrapper[4628]: I1211 05:18:02.760885 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qszrs" Dec 11 05:18:02 crc kubenswrapper[4628]: I1211 05:18:02.762009 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qszrs" Dec 11 05:18:02 crc kubenswrapper[4628]: I1211 05:18:02.823213 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qszrs" Dec 11 05:18:02 crc kubenswrapper[4628]: I1211 05:18:02.823592 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:02 crc kubenswrapper[4628]: I1211 05:18:02.823808 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:02 crc kubenswrapper[4628]: I1211 05:18:02.824055 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:02 crc kubenswrapper[4628]: I1211 05:18:02.824233 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:02 crc kubenswrapper[4628]: I1211 05:18:02.824437 4628 status_manager.go:851] "Failed to get status for pod" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" pod="openshift-marketplace/certified-operators-qszrs" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qszrs\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.314591 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8bd7x" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.315289 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8bd7x" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.371519 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8bd7x" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.372668 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.373014 4628 status_manager.go:851] "Failed to get status for pod" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" pod="openshift-marketplace/certified-operators-qszrs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qszrs\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.373367 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.373797 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.375377 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.405499 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qszrs" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.405954 4628 status_manager.go:851] "Failed to get status for pod" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" pod="openshift-marketplace/certified-operators-qszrs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qszrs\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.406220 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.406369 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.406521 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.406659 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.420633 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8bd7x" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.421161 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.421411 4628 status_manager.go:851] "Failed to get status for pod" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" pod="openshift-marketplace/certified-operators-qszrs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qszrs\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.421567 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.421710 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:03 crc kubenswrapper[4628]: I1211 05:18:03.421884 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:03 crc kubenswrapper[4628]: E1211 05:18:03.597815 
4628 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="6.4s" Dec 11 05:18:05 crc kubenswrapper[4628]: I1211 05:18:05.722347 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zj2jl" Dec 11 05:18:05 crc kubenswrapper[4628]: I1211 05:18:05.722433 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zj2jl" Dec 11 05:18:05 crc kubenswrapper[4628]: I1211 05:18:05.779521 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zj2jl" Dec 11 05:18:05 crc kubenswrapper[4628]: I1211 05:18:05.780422 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:05 crc kubenswrapper[4628]: I1211 05:18:05.780978 4628 status_manager.go:851] "Failed to get status for pod" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" pod="openshift-marketplace/certified-operators-qszrs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qszrs\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:05 crc kubenswrapper[4628]: I1211 05:18:05.781548 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:05 crc kubenswrapper[4628]: I1211 05:18:05.782291 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:05 crc kubenswrapper[4628]: I1211 05:18:05.782828 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:06 crc kubenswrapper[4628]: I1211 05:18:06.443392 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zj2jl" Dec 11 05:18:06 crc kubenswrapper[4628]: I1211 05:18:06.444259 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:06 crc kubenswrapper[4628]: I1211 05:18:06.444937 4628 status_manager.go:851] "Failed to get status for pod" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" 
pod="openshift-marketplace/certified-operators-qszrs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qszrs\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:06 crc kubenswrapper[4628]: I1211 05:18:06.445307 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:06 crc kubenswrapper[4628]: I1211 05:18:06.445730 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:06 crc kubenswrapper[4628]: I1211 05:18:06.446236 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:07 crc kubenswrapper[4628]: I1211 05:18:07.896778 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:07 crc kubenswrapper[4628]: I1211 05:18:07.897831 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:07 crc kubenswrapper[4628]: I1211 05:18:07.898502 4628 status_manager.go:851] "Failed to get status for pod" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" pod="openshift-marketplace/certified-operators-qszrs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qszrs\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:07 crc kubenswrapper[4628]: I1211 05:18:07.898908 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:07 crc kubenswrapper[4628]: I1211 05:18:07.899189 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:09 crc kubenswrapper[4628]: E1211 05:18:09.998939 4628 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.18:6443: connect: connection refused" interval="7s" Dec 11 05:18:10 crc kubenswrapper[4628]: I1211 05:18:10.577417 4628 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 11 05:18:10 crc kubenswrapper[4628]: I1211 05:18:10.577559 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 11 05:18:10 crc kubenswrapper[4628]: I1211 05:18:10.888925 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:18:10 crc kubenswrapper[4628]: I1211 05:18:10.890585 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:10 crc kubenswrapper[4628]: I1211 05:18:10.891339 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:10 crc kubenswrapper[4628]: I1211 05:18:10.891891 4628 status_manager.go:851] "Failed to get status for pod" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" pod="openshift-marketplace/certified-operators-qszrs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qszrs\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:10 crc kubenswrapper[4628]: I1211 05:18:10.892491 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:10 crc kubenswrapper[4628]: I1211 05:18:10.893155 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:10 crc kubenswrapper[4628]: I1211 05:18:10.913296 4628 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a98947e-b435-4c9f-8356-537c79cc8b22" Dec 11 05:18:10 crc kubenswrapper[4628]: I1211 05:18:10.913344 4628 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a98947e-b435-4c9f-8356-537c79cc8b22" Dec 11 05:18:10 crc 
kubenswrapper[4628]: E1211 05:18:10.914156 4628 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:18:10 crc kubenswrapper[4628]: I1211 05:18:10.914681 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:18:11 crc kubenswrapper[4628]: E1211 05:18:11.149060 4628 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.18:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-8bd7x.1880117badd3dc79 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-8bd7x,UID:a0ff0009-81bb-47da-aab8-5caeeec49061,APIVersion:v1,ResourceVersion:29428,FieldPath:spec.initContainers{extract-content},},Reason:Pulled,Message:Successfully pulled image \"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\" in 602ms (602ms including waiting). Image size: 1154573130 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-11 05:17:56.824226937 +0000 UTC m=+179.241573645,LastTimestamp:2025-12-11 05:17:56.824226937 +0000 UTC m=+179.241573645,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 11 05:18:11 crc kubenswrapper[4628]: I1211 05:18:11.406632 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"08711e70bdeb3f1cdbbcf2792c699ec752b745a5bc461c8c1ae72eca756d8387"} Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.424821 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.425177 4628 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c1e544f479d3ab726765f5a5b361070e9f87062533a676c46064b447c9469eb5" exitCode=1 Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.425267 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c1e544f479d3ab726765f5a5b361070e9f87062533a676c46064b447c9469eb5"} Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.425918 4628 scope.go:117] "RemoveContainer" containerID="c1e544f479d3ab726765f5a5b361070e9f87062533a676c46064b447c9469eb5" Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.426645 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.427016 4628 status_manager.go:851] "Failed to get status for pod" 
podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" pod="openshift-marketplace/certified-operators-qszrs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qszrs\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.427572 4628 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.428017 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.428365 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.428640 4628 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="fca6839f9d91650e1566ef1ddf8c79ee7e3b0a735978989ea72ca519704bd567" exitCode=0 Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.428700 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"fca6839f9d91650e1566ef1ddf8c79ee7e3b0a735978989ea72ca519704bd567"} Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.428895 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.429379 4628 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a98947e-b435-4c9f-8356-537c79cc8b22" Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.429412 4628 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a98947e-b435-4c9f-8356-537c79cc8b22" Dec 11 05:18:13 crc kubenswrapper[4628]: E1211 05:18:13.429891 4628 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.429908 4628 status_manager.go:851] "Failed to get status for pod" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" pod="openshift-marketplace/community-operators-zj2jl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-zj2jl\": dial tcp 
38.102.83.18:6443: connect: connection refused" Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.430445 4628 status_manager.go:851] "Failed to get status for pod" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" pod="openshift-marketplace/certified-operators-qszrs" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qszrs\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.431058 4628 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.431542 4628 status_manager.go:851] "Failed to get status for pod" podUID="a0ff0009-81bb-47da-aab8-5caeeec49061" pod="openshift-marketplace/redhat-marketplace-8bd7x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-8bd7x\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.432005 4628 status_manager.go:851] "Failed to get status for pod" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:13 crc kubenswrapper[4628]: I1211 05:18:13.432530 4628 status_manager.go:851] "Failed to get status for pod" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" pod="openshift-marketplace/redhat-operators-mjcbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-mjcbm\": dial tcp 38.102.83.18:6443: connect: connection refused" Dec 11 05:18:14 crc kubenswrapper[4628]: I1211 05:18:14.436682 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 11 05:18:14 crc kubenswrapper[4628]: I1211 05:18:14.437018 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9b6260d2682122df8e1dbaa53c54cc98a51875df939b1e03b10585056cf5e221"} Dec 11 05:18:14 crc kubenswrapper[4628]: I1211 05:18:14.440601 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"621198e42ad6c3146538901d2b347261f3de0ec6105e7bcfe71c02c7ac5af0c3"} Dec 11 05:18:14 crc kubenswrapper[4628]: I1211 05:18:14.440661 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7b88d2b5fb4c8013b41d8bc55289ec92727030a633ffea3d5b9487b30b7baf95"} Dec 11 05:18:14 crc kubenswrapper[4628]: I1211 05:18:14.440678 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c1c2412863d12352fb77a8d9c0f33e1de6d2d77c5b34a0f448c7d26e28f4d585"} Dec 11 05:18:15 crc kubenswrapper[4628]: I1211 05:18:15.448117 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f5913a83f04d5eb32485a3b3bbbb5b56760c937f5e56f7caaddf65f95be7fe33"} Dec 11 05:18:15 crc kubenswrapper[4628]: I1211 05:18:15.448441 4628 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a98947e-b435-4c9f-8356-537c79cc8b22" Dec 11 05:18:15 crc kubenswrapper[4628]: I1211 05:18:15.448470 4628 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a98947e-b435-4c9f-8356-537c79cc8b22" Dec 11 05:18:15 crc kubenswrapper[4628]: I1211 05:18:15.448470 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:18:15 crc kubenswrapper[4628]: I1211 05:18:15.448644 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"19f97c18d33384faecd22c1858ca379dfc4ad85fc661c948bc4de71854ad8110"} Dec 11 05:18:15 crc kubenswrapper[4628]: I1211 05:18:15.535428 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:18:15 crc kubenswrapper[4628]: I1211 05:18:15.548232 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:18:15 crc kubenswrapper[4628]: I1211 05:18:15.915649 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:18:15 crc kubenswrapper[4628]: I1211 05:18:15.915720 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:18:15 crc kubenswrapper[4628]: I1211 05:18:15.925995 4628 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]log ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]etcd ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/generic-apiserver-start-informers ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/priority-and-fairness-filter ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 11 05:18:15 crc 
kubenswrapper[4628]: [+]poststarthook/start-apiextensions-informers ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/start-apiextensions-controllers ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/crd-informer-synced ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/start-system-namespaces-controller ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 11 05:18:15 crc kubenswrapper[4628]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 11 05:18:15 crc kubenswrapper[4628]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/bootstrap-controller ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/start-kube-aggregator-informers ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/apiservice-registration-controller ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/apiservice-discovery-controller ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]autoregister-completion ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/apiservice-openapi-controller ok Dec 11 05:18:15 crc kubenswrapper[4628]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 11 05:18:15 crc kubenswrapper[4628]: livez check failed Dec 11 05:18:15 crc kubenswrapper[4628]: I1211 05:18:15.926120 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 11 05:18:16 crc kubenswrapper[4628]: I1211 05:18:16.453180 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:18:20 crc kubenswrapper[4628]: I1211 05:18:20.460164 4628 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:18:20 crc kubenswrapper[4628]: I1211 05:18:20.921070 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:18:20 crc kubenswrapper[4628]: I1211 05:18:20.923971 4628 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e7e4a9ff-535a-4337-9a67-e8ce96a1778c" Dec 11 
05:18:21 crc kubenswrapper[4628]: I1211 05:18:21.483829 4628 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a98947e-b435-4c9f-8356-537c79cc8b22" Dec 11 05:18:21 crc kubenswrapper[4628]: I1211 05:18:21.483905 4628 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a98947e-b435-4c9f-8356-537c79cc8b22" Dec 11 05:18:21 crc kubenswrapper[4628]: I1211 05:18:21.490466 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:18:22 crc kubenswrapper[4628]: I1211 05:18:22.490764 4628 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a98947e-b435-4c9f-8356-537c79cc8b22" Dec 11 05:18:22 crc kubenswrapper[4628]: I1211 05:18:22.490811 4628 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a98947e-b435-4c9f-8356-537c79cc8b22" Dec 11 05:18:27 crc kubenswrapper[4628]: I1211 05:18:27.920021 4628 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e7e4a9ff-535a-4337-9a67-e8ce96a1778c" Dec 11 05:18:30 crc kubenswrapper[4628]: I1211 05:18:30.306266 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 11 05:18:30 crc kubenswrapper[4628]: I1211 05:18:30.471394 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 11 05:18:30 crc kubenswrapper[4628]: I1211 05:18:30.585251 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 11 05:18:30 crc kubenswrapper[4628]: I1211 05:18:30.751390 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 11 05:18:30 crc kubenswrapper[4628]: I1211 05:18:30.898642 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 11 05:18:31 crc kubenswrapper[4628]: I1211 05:18:31.255401 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 11 05:18:31 crc kubenswrapper[4628]: I1211 05:18:31.292482 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 11 05:18:31 crc kubenswrapper[4628]: I1211 05:18:31.384419 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 11 05:18:31 crc kubenswrapper[4628]: I1211 05:18:31.427004 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:18:31 crc kubenswrapper[4628]: I1211 05:18:31.427097 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:18:31 crc kubenswrapper[4628]: I1211 05:18:31.491508 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 11 05:18:31 crc kubenswrapper[4628]: I1211 05:18:31.613940 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 11 05:18:31 crc kubenswrapper[4628]: I1211 05:18:31.618132 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 11 05:18:31 crc kubenswrapper[4628]: I1211 05:18:31.663191 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 11 05:18:31 crc kubenswrapper[4628]: I1211 05:18:31.693239 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 11 05:18:31 crc kubenswrapper[4628]: I1211 05:18:31.727699 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 11 05:18:31 crc kubenswrapper[4628]: I1211 05:18:31.819267 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 11 05:18:31 crc kubenswrapper[4628]: I1211 05:18:31.915277 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 11 05:18:32 crc kubenswrapper[4628]: I1211 05:18:32.141187 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 11 05:18:32 crc kubenswrapper[4628]: I1211 05:18:32.159482 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 11 05:18:32 crc kubenswrapper[4628]: I1211 05:18:32.191230 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 11 05:18:32 crc kubenswrapper[4628]: I1211 05:18:32.255167 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 11 05:18:32 crc kubenswrapper[4628]: I1211 05:18:32.334923 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 11 05:18:32 crc kubenswrapper[4628]: I1211 05:18:32.378380 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 11 05:18:32 crc kubenswrapper[4628]: I1211 05:18:32.979387 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.018706 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.029156 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.058668 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.110582 4628 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.215029 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.226117 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.243782 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.276696 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.487663 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.549597 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.603009 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.712134 4628 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.713306 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zj2jl" podStartSLOduration=36.194073836 podStartE2EDuration="38.713281691s" podCreationTimestamp="2025-12-11 05:17:55 +0000 UTC" firstStartedPulling="2025-12-11 05:17:56.231369811 +0000 UTC m=+178.648716539" lastFinishedPulling="2025-12-11 05:17:58.750577696 +0000 UTC m=+181.167924394" observedRunningTime="2025-12-11 05:18:20.228635243 +0000 UTC m=+202.645981961" watchObservedRunningTime="2025-12-11 05:18:33.713281691 +0000 UTC m=+216.130628429" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.717418 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qszrs" podStartSLOduration=39.154451759 podStartE2EDuration="41.717398708s" podCreationTimestamp="2025-12-11 05:17:52 +0000 UTC" firstStartedPulling="2025-12-11 05:17:56.226881096 +0000 UTC m=+178.644227804" lastFinishedPulling="2025-12-11 05:17:58.789828055 +0000 UTC m=+181.207174753" observedRunningTime="2025-12-11 05:18:20.253428709 +0000 UTC m=+202.670775407" watchObservedRunningTime="2025-12-11 05:18:33.717398708 +0000 UTC m=+216.134745476" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.717897 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8bd7x" podStartSLOduration=40.154678072 podStartE2EDuration="41.71783047s" podCreationTimestamp="2025-12-11 05:17:52 +0000 UTC" firstStartedPulling="2025-12-11 05:17:56.221994031 +0000 UTC m=+178.639340759" lastFinishedPulling="2025-12-11 05:17:57.785146419 +0000 UTC m=+180.202493157" observedRunningTime="2025-12-11 05:18:20.285066248 +0000 UTC m=+202.702412946" watchObservedRunningTime="2025-12-11 05:18:33.71783047 +0000 UTC m=+216.135177218" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.726166 4628 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-marketplace/redhat-operators-mjcbm"] Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.726298 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.729433 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.755981 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=13.755955435 podStartE2EDuration="13.755955435s" podCreationTimestamp="2025-12-11 05:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:18:33.752003912 +0000 UTC m=+216.169350640" watchObservedRunningTime="2025-12-11 05:18:33.755955435 +0000 UTC m=+216.173302193" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.756723 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.770615 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.774116 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.804653 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.834417 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.899330 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" path="/var/lib/kubelet/pods/2a9eb6ef-92ff-415b-a526-26711b88985f/volumes" Dec 11 05:18:33 crc kubenswrapper[4628]: I1211 05:18:33.944341 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 11 05:18:34 crc kubenswrapper[4628]: I1211 05:18:34.022882 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 11 05:18:34 crc kubenswrapper[4628]: I1211 05:18:34.089549 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 11 05:18:34 crc kubenswrapper[4628]: I1211 05:18:34.147339 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 11 05:18:34 crc kubenswrapper[4628]: I1211 05:18:34.180323 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 11 05:18:34 crc kubenswrapper[4628]: I1211 05:18:34.199696 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 11 05:18:34 crc kubenswrapper[4628]: I1211 05:18:34.304677 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 11 05:18:34 crc 
kubenswrapper[4628]: I1211 05:18:34.309893 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 11 05:18:34 crc kubenswrapper[4628]: I1211 05:18:34.362825 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 11 05:18:34 crc kubenswrapper[4628]: I1211 05:18:34.462813 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 11 05:18:34 crc kubenswrapper[4628]: I1211 05:18:34.526231 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 11 05:18:34 crc kubenswrapper[4628]: I1211 05:18:34.595776 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 11 05:18:34 crc kubenswrapper[4628]: I1211 05:18:34.601753 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 11 05:18:34 crc kubenswrapper[4628]: I1211 05:18:34.689970 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 11 05:18:34 crc kubenswrapper[4628]: I1211 05:18:34.867407 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 11 05:18:34 crc kubenswrapper[4628]: I1211 05:18:34.889305 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 11 05:18:34 crc kubenswrapper[4628]: I1211 05:18:34.953568 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.046078 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.135902 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.188770 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.198242 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.293507 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.298030 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.386825 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.471411 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.506691 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 
05:18:35.542595 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.571664 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.669759 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.705832 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.761482 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.762399 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.783029 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.832611 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.870858 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.882321 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.994725 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 11 05:18:35 crc kubenswrapper[4628]: I1211 05:18:35.995548 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 11 05:18:36 crc kubenswrapper[4628]: I1211 05:18:36.017806 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 11 05:18:36 crc kubenswrapper[4628]: I1211 05:18:36.102504 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 11 05:18:36 crc kubenswrapper[4628]: I1211 05:18:36.128351 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 11 05:18:36 crc kubenswrapper[4628]: I1211 05:18:36.140089 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 11 05:18:36 crc kubenswrapper[4628]: I1211 05:18:36.159413 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 11 05:18:36 crc kubenswrapper[4628]: I1211 05:18:36.457746 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 11 05:18:36 crc kubenswrapper[4628]: I1211 05:18:36.493910 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 11 
05:18:36 crc kubenswrapper[4628]: I1211 05:18:36.494620 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 11 05:18:36 crc kubenswrapper[4628]: I1211 05:18:36.523426 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 11 05:18:36 crc kubenswrapper[4628]: I1211 05:18:36.540885 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 11 05:18:36 crc kubenswrapper[4628]: I1211 05:18:36.684482 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 11 05:18:36 crc kubenswrapper[4628]: I1211 05:18:36.695482 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 11 05:18:36 crc kubenswrapper[4628]: I1211 05:18:36.702979 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 11 05:18:36 crc kubenswrapper[4628]: I1211 05:18:36.743901 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 11 05:18:36 crc kubenswrapper[4628]: I1211 05:18:36.756987 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 11 05:18:36 crc kubenswrapper[4628]: I1211 05:18:36.832470 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 11 05:18:36 crc kubenswrapper[4628]: I1211 05:18:36.835743 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 11 05:18:36 crc kubenswrapper[4628]: I1211 05:18:36.869142 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 11 05:18:37 crc kubenswrapper[4628]: I1211 05:18:37.107232 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 11 05:18:37 crc kubenswrapper[4628]: I1211 05:18:37.110708 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 11 05:18:37 crc kubenswrapper[4628]: I1211 05:18:37.143686 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 11 05:18:37 crc kubenswrapper[4628]: I1211 05:18:37.240932 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 11 05:18:37 crc kubenswrapper[4628]: I1211 05:18:37.342487 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 11 05:18:37 crc kubenswrapper[4628]: I1211 05:18:37.453285 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 11 05:18:37 crc kubenswrapper[4628]: I1211 05:18:37.562792 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 11 05:18:37 crc kubenswrapper[4628]: I1211 05:18:37.580295 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 11 05:18:37 crc kubenswrapper[4628]: I1211 
05:18:37.641156 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 11 05:18:37 crc kubenswrapper[4628]: I1211 05:18:37.644784 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 11 05:18:37 crc kubenswrapper[4628]: I1211 05:18:37.808639 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 11 05:18:37 crc kubenswrapper[4628]: I1211 05:18:37.973286 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 11 05:18:38 crc kubenswrapper[4628]: I1211 05:18:38.007161 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 11 05:18:38 crc kubenswrapper[4628]: I1211 05:18:38.040100 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 11 05:18:38 crc kubenswrapper[4628]: I1211 05:18:38.074292 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 11 05:18:38 crc kubenswrapper[4628]: I1211 05:18:38.168733 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 11 05:18:38 crc kubenswrapper[4628]: I1211 05:18:38.241338 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 11 05:18:38 crc kubenswrapper[4628]: I1211 05:18:38.260903 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 11 05:18:38 crc kubenswrapper[4628]: I1211 05:18:38.310761 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 11 05:18:38 crc kubenswrapper[4628]: I1211 05:18:38.456557 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 11 05:18:38 crc kubenswrapper[4628]: I1211 05:18:38.623682 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 11 05:18:38 crc kubenswrapper[4628]: I1211 05:18:38.672193 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 11 05:18:38 crc kubenswrapper[4628]: I1211 05:18:38.927961 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 11 05:18:38 crc kubenswrapper[4628]: I1211 05:18:38.937965 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 11 05:18:38 crc kubenswrapper[4628]: I1211 05:18:38.948911 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 11 05:18:38 crc kubenswrapper[4628]: I1211 05:18:38.976708 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 11 05:18:38 crc kubenswrapper[4628]: I1211 05:18:38.996834 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.029156 
4628 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.065309 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.138355 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.235681 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.245103 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.262979 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.345578 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.351453 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.365956 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.376022 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.512638 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.561218 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.591998 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.605764 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.612920 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.652669 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.663496 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.683834 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.840839 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.892220 4628 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 11 05:18:39 crc kubenswrapper[4628]: I1211 05:18:39.979948 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 11 05:18:40 crc kubenswrapper[4628]: I1211 05:18:40.058698 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 11 05:18:40 crc kubenswrapper[4628]: I1211 05:18:40.064546 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 11 05:18:40 crc kubenswrapper[4628]: I1211 05:18:40.075184 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 11 05:18:40 crc kubenswrapper[4628]: I1211 05:18:40.125024 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 11 05:18:40 crc kubenswrapper[4628]: I1211 05:18:40.175881 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 11 05:18:40 crc kubenswrapper[4628]: I1211 05:18:40.221764 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 11 05:18:40 crc kubenswrapper[4628]: I1211 05:18:40.229522 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 11 05:18:40 crc kubenswrapper[4628]: I1211 05:18:40.260494 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 11 05:18:40 crc kubenswrapper[4628]: I1211 05:18:40.313024 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 11 05:18:40 crc kubenswrapper[4628]: I1211 05:18:40.365962 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 11 05:18:40 crc kubenswrapper[4628]: I1211 05:18:40.408376 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 11 05:18:40 crc kubenswrapper[4628]: I1211 05:18:40.525794 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 11 05:18:40 crc kubenswrapper[4628]: I1211 05:18:40.589224 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 11 05:18:40 crc kubenswrapper[4628]: I1211 05:18:40.748469 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 11 05:18:40 crc kubenswrapper[4628]: I1211 05:18:40.760622 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 11 05:18:40 crc kubenswrapper[4628]: I1211 05:18:40.873791 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 11 05:18:40 crc kubenswrapper[4628]: I1211 05:18:40.920699 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 11 05:18:41 crc kubenswrapper[4628]: I1211 
05:18:41.034048 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 11 05:18:41 crc kubenswrapper[4628]: I1211 05:18:41.108030 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 11 05:18:41 crc kubenswrapper[4628]: I1211 05:18:41.129539 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 11 05:18:41 crc kubenswrapper[4628]: I1211 05:18:41.157950 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 11 05:18:41 crc kubenswrapper[4628]: I1211 05:18:41.166310 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 11 05:18:41 crc kubenswrapper[4628]: I1211 05:18:41.258995 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 11 05:18:41 crc kubenswrapper[4628]: I1211 05:18:41.264704 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 11 05:18:41 crc kubenswrapper[4628]: I1211 05:18:41.309554 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 11 05:18:41 crc kubenswrapper[4628]: I1211 05:18:41.427323 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 11 05:18:41 crc kubenswrapper[4628]: I1211 05:18:41.501708 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 11 05:18:41 crc kubenswrapper[4628]: I1211 05:18:41.706826 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 11 05:18:41 crc kubenswrapper[4628]: I1211 05:18:41.718592 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 11 05:18:41 crc kubenswrapper[4628]: I1211 05:18:41.737707 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 11 05:18:41 crc kubenswrapper[4628]: I1211 05:18:41.741689 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 11 05:18:41 crc kubenswrapper[4628]: I1211 05:18:41.747928 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 11 05:18:41 crc kubenswrapper[4628]: I1211 05:18:41.783805 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.011002 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.032012 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.083258 4628 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.130387 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.197025 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.198678 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.364442 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.402498 4628 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.417874 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.453273 4628 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.580658 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.600498 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.633477 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.680026 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.729974 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.889393 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.892018 4628 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.892350 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://df192c79190a0951e9429a8103abff2c2c5618e60acf5d38722c091c2a1574a6" gracePeriod=5 Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.954939 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 11 05:18:42 crc kubenswrapper[4628]: I1211 05:18:42.974182 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.029998 4628 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-console"/"networking-console-plugin-cert" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.033638 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.033907 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.115488 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.163703 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.223904 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.318721 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.378154 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.441401 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.444069 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.445724 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.612912 4628 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.656994 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.676125 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.731676 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.765319 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.897310 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.907724 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 11 05:18:43 crc kubenswrapper[4628]: I1211 05:18:43.942076 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 11 05:18:44 crc 
kubenswrapper[4628]: I1211 05:18:44.012562 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 11 05:18:44 crc kubenswrapper[4628]: I1211 05:18:44.052216 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 11 05:18:44 crc kubenswrapper[4628]: I1211 05:18:44.123600 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 11 05:18:44 crc kubenswrapper[4628]: I1211 05:18:44.376385 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 11 05:18:44 crc kubenswrapper[4628]: I1211 05:18:44.393209 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 11 05:18:44 crc kubenswrapper[4628]: I1211 05:18:44.395188 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 11 05:18:44 crc kubenswrapper[4628]: I1211 05:18:44.431639 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 11 05:18:44 crc kubenswrapper[4628]: I1211 05:18:44.506571 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 11 05:18:44 crc kubenswrapper[4628]: I1211 05:18:44.518659 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 11 05:18:44 crc kubenswrapper[4628]: I1211 05:18:44.704477 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 11 05:18:44 crc kubenswrapper[4628]: I1211 05:18:44.849056 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 11 05:18:45 crc kubenswrapper[4628]: I1211 05:18:45.071242 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 11 05:18:45 crc kubenswrapper[4628]: I1211 05:18:45.219987 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 11 05:18:45 crc kubenswrapper[4628]: I1211 05:18:45.240650 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.472605 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.473398 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.582820 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.582893 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.582937 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.582952 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.583013 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.583076 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.583110 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.583110 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.583144 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.583508 4628 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.583546 4628 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.583563 4628 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.583576 4628 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.593434 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.645802 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.645886 4628 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="df192c79190a0951e9429a8103abff2c2c5618e60acf5d38722c091c2a1574a6" exitCode=137 Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.645932 4628 scope.go:117] "RemoveContainer" containerID="df192c79190a0951e9429a8103abff2c2c5618e60acf5d38722c091c2a1574a6" Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.646043 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.668819 4628 scope.go:117] "RemoveContainer" containerID="df192c79190a0951e9429a8103abff2c2c5618e60acf5d38722c091c2a1574a6" Dec 11 05:18:48 crc kubenswrapper[4628]: E1211 05:18:48.669615 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df192c79190a0951e9429a8103abff2c2c5618e60acf5d38722c091c2a1574a6\": container with ID starting with df192c79190a0951e9429a8103abff2c2c5618e60acf5d38722c091c2a1574a6 not found: ID does not exist" containerID="df192c79190a0951e9429a8103abff2c2c5618e60acf5d38722c091c2a1574a6" Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.669710 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df192c79190a0951e9429a8103abff2c2c5618e60acf5d38722c091c2a1574a6"} err="failed to get container status \"df192c79190a0951e9429a8103abff2c2c5618e60acf5d38722c091c2a1574a6\": rpc error: code = NotFound desc = could not find container \"df192c79190a0951e9429a8103abff2c2c5618e60acf5d38722c091c2a1574a6\": container with ID starting with df192c79190a0951e9429a8103abff2c2c5618e60acf5d38722c091c2a1574a6 not found: ID does not exist" Dec 11 05:18:48 crc kubenswrapper[4628]: I1211 05:18:48.684530 4628 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 11 05:18:49 crc kubenswrapper[4628]: I1211 05:18:49.901310 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 11 05:18:59 crc kubenswrapper[4628]: I1211 05:18:59.029896 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 11 05:19:01 crc kubenswrapper[4628]: I1211 05:19:01.426595 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:19:01 crc kubenswrapper[4628]: I1211 05:19:01.427137 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:19:01 crc kubenswrapper[4628]: I1211 05:19:01.427204 4628 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:19:01 crc kubenswrapper[4628]: I1211 05:19:01.428035 4628 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1ceb30e4a3d9e4f8a0cb5dd8e8ae33f28f9c75bc4c4706b76660db8785b07748"} pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 05:19:01 crc kubenswrapper[4628]: I1211 05:19:01.428135 4628 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" containerID="cri-o://1ceb30e4a3d9e4f8a0cb5dd8e8ae33f28f9c75bc4c4706b76660db8785b07748" gracePeriod=600 Dec 11 05:19:01 crc kubenswrapper[4628]: I1211 05:19:01.504217 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 11 05:19:01 crc kubenswrapper[4628]: I1211 05:19:01.710014 4628 generic.go:334] "Generic (PLEG): container finished" podID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerID="1ceb30e4a3d9e4f8a0cb5dd8e8ae33f28f9c75bc4c4706b76660db8785b07748" exitCode=0 Dec 11 05:19:01 crc kubenswrapper[4628]: I1211 05:19:01.710051 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerDied","Data":"1ceb30e4a3d9e4f8a0cb5dd8e8ae33f28f9c75bc4c4706b76660db8785b07748"} Dec 11 05:19:01 crc kubenswrapper[4628]: I1211 05:19:01.710416 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"ddad4725ec6c3a9427422cdf04ad9742fa14cfafe1a3cf96a99beec112e27db7"} Dec 11 05:19:01 crc kubenswrapper[4628]: I1211 05:19:01.900627 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 11 05:19:02 crc kubenswrapper[4628]: I1211 05:19:02.291617 4628 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 11 05:19:03 crc kubenswrapper[4628]: I1211 05:19:03.238619 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 11 05:19:03 crc kubenswrapper[4628]: I1211 05:19:03.694758 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 11 05:19:04 crc kubenswrapper[4628]: I1211 05:19:04.066538 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8sttg"] Dec 11 05:19:04 crc kubenswrapper[4628]: I1211 05:19:04.066888 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" podUID="d5a26a57-89a7-4c5c-902c-a19020e4a01a" containerName="controller-manager" containerID="cri-o://b63f7f7ad748d56c454956681b0aca20e830c39035f125d22e91829a5ca681dd" gracePeriod=30 Dec 11 05:19:04 crc kubenswrapper[4628]: I1211 05:19:04.080376 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc"] Dec 11 05:19:04 crc kubenswrapper[4628]: I1211 05:19:04.080633 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" podUID="e7210231-68df-4f2b-888f-90827f723bd2" containerName="route-controller-manager" containerID="cri-o://d08f39bc9250bf57aab70d4639a0fe533b75a52f3aebed1c3c4b6980c8efd834" gracePeriod=30 Dec 11 05:19:04 crc kubenswrapper[4628]: I1211 05:19:04.725958 4628 generic.go:334] "Generic (PLEG): container finished" podID="d5a26a57-89a7-4c5c-902c-a19020e4a01a" 
containerID="b63f7f7ad748d56c454956681b0aca20e830c39035f125d22e91829a5ca681dd" exitCode=0 Dec 11 05:19:04 crc kubenswrapper[4628]: I1211 05:19:04.726032 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" event={"ID":"d5a26a57-89a7-4c5c-902c-a19020e4a01a","Type":"ContainerDied","Data":"b63f7f7ad748d56c454956681b0aca20e830c39035f125d22e91829a5ca681dd"} Dec 11 05:19:04 crc kubenswrapper[4628]: I1211 05:19:04.727411 4628 generic.go:334] "Generic (PLEG): container finished" podID="e7210231-68df-4f2b-888f-90827f723bd2" containerID="d08f39bc9250bf57aab70d4639a0fe533b75a52f3aebed1c3c4b6980c8efd834" exitCode=0 Dec 11 05:19:04 crc kubenswrapper[4628]: I1211 05:19:04.727443 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" event={"ID":"e7210231-68df-4f2b-888f-90827f723bd2","Type":"ContainerDied","Data":"d08f39bc9250bf57aab70d4639a0fe533b75a52f3aebed1c3c4b6980c8efd834"} Dec 11 05:19:04 crc kubenswrapper[4628]: I1211 05:19:04.985481 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:19:04 crc kubenswrapper[4628]: I1211 05:19:04.992040 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnfkc\" (UniqueName: \"kubernetes.io/projected/d5a26a57-89a7-4c5c-902c-a19020e4a01a-kube-api-access-fnfkc\") pod \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " Dec 11 05:19:04 crc kubenswrapper[4628]: I1211 05:19:04.992109 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d5a26a57-89a7-4c5c-902c-a19020e4a01a-proxy-ca-bundles\") pod \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " Dec 11 05:19:04 crc kubenswrapper[4628]: I1211 05:19:04.992128 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5a26a57-89a7-4c5c-902c-a19020e4a01a-client-ca\") pod \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " Dec 11 05:19:04 crc kubenswrapper[4628]: I1211 05:19:04.992144 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a26a57-89a7-4c5c-902c-a19020e4a01a-config\") pod \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " Dec 11 05:19:04 crc kubenswrapper[4628]: I1211 05:19:04.992187 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a26a57-89a7-4c5c-902c-a19020e4a01a-serving-cert\") pod \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\" (UID: \"d5a26a57-89a7-4c5c-902c-a19020e4a01a\") " Dec 11 05:19:04 crc kubenswrapper[4628]: I1211 05:19:04.993018 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5a26a57-89a7-4c5c-902c-a19020e4a01a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d5a26a57-89a7-4c5c-902c-a19020e4a01a" (UID: "d5a26a57-89a7-4c5c-902c-a19020e4a01a"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:19:04 crc kubenswrapper[4628]: I1211 05:19:04.993044 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5a26a57-89a7-4c5c-902c-a19020e4a01a-client-ca" (OuterVolumeSpecName: "client-ca") pod "d5a26a57-89a7-4c5c-902c-a19020e4a01a" (UID: "d5a26a57-89a7-4c5c-902c-a19020e4a01a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:19:04 crc kubenswrapper[4628]: I1211 05:19:04.993077 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5a26a57-89a7-4c5c-902c-a19020e4a01a-config" (OuterVolumeSpecName: "config") pod "d5a26a57-89a7-4c5c-902c-a19020e4a01a" (UID: "d5a26a57-89a7-4c5c-902c-a19020e4a01a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.016630 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5a26a57-89a7-4c5c-902c-a19020e4a01a-kube-api-access-fnfkc" (OuterVolumeSpecName: "kube-api-access-fnfkc") pod "d5a26a57-89a7-4c5c-902c-a19020e4a01a" (UID: "d5a26a57-89a7-4c5c-902c-a19020e4a01a"). InnerVolumeSpecName "kube-api-access-fnfkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.020234 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5a26a57-89a7-4c5c-902c-a19020e4a01a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d5a26a57-89a7-4c5c-902c-a19020e4a01a" (UID: "d5a26a57-89a7-4c5c-902c-a19020e4a01a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.098601 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnfkc\" (UniqueName: \"kubernetes.io/projected/d5a26a57-89a7-4c5c-902c-a19020e4a01a-kube-api-access-fnfkc\") on node \"crc\" DevicePath \"\"" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.098628 4628 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d5a26a57-89a7-4c5c-902c-a19020e4a01a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.098637 4628 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d5a26a57-89a7-4c5c-902c-a19020e4a01a-client-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.098646 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a26a57-89a7-4c5c-902c-a19020e4a01a-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.098657 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a26a57-89a7-4c5c-902c-a19020e4a01a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.153479 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.199996 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7210231-68df-4f2b-888f-90827f723bd2-config\") pod \"e7210231-68df-4f2b-888f-90827f723bd2\" (UID: \"e7210231-68df-4f2b-888f-90827f723bd2\") " Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.200040 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7210231-68df-4f2b-888f-90827f723bd2-client-ca\") pod \"e7210231-68df-4f2b-888f-90827f723bd2\" (UID: \"e7210231-68df-4f2b-888f-90827f723bd2\") " Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.200095 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7210231-68df-4f2b-888f-90827f723bd2-serving-cert\") pod \"e7210231-68df-4f2b-888f-90827f723bd2\" (UID: \"e7210231-68df-4f2b-888f-90827f723bd2\") " Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.200118 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdzh8\" (UniqueName: \"kubernetes.io/projected/e7210231-68df-4f2b-888f-90827f723bd2-kube-api-access-pdzh8\") pod \"e7210231-68df-4f2b-888f-90827f723bd2\" (UID: \"e7210231-68df-4f2b-888f-90827f723bd2\") " Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.201199 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7210231-68df-4f2b-888f-90827f723bd2-client-ca" (OuterVolumeSpecName: "client-ca") pod "e7210231-68df-4f2b-888f-90827f723bd2" (UID: "e7210231-68df-4f2b-888f-90827f723bd2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.201654 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7210231-68df-4f2b-888f-90827f723bd2-config" (OuterVolumeSpecName: "config") pod "e7210231-68df-4f2b-888f-90827f723bd2" (UID: "e7210231-68df-4f2b-888f-90827f723bd2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.212194 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7210231-68df-4f2b-888f-90827f723bd2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7210231-68df-4f2b-888f-90827f723bd2" (UID: "e7210231-68df-4f2b-888f-90827f723bd2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.212297 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7210231-68df-4f2b-888f-90827f723bd2-kube-api-access-pdzh8" (OuterVolumeSpecName: "kube-api-access-pdzh8") pod "e7210231-68df-4f2b-888f-90827f723bd2" (UID: "e7210231-68df-4f2b-888f-90827f723bd2"). InnerVolumeSpecName "kube-api-access-pdzh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.301721 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7210231-68df-4f2b-888f-90827f723bd2-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.301782 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdzh8\" (UniqueName: \"kubernetes.io/projected/e7210231-68df-4f2b-888f-90827f723bd2-kube-api-access-pdzh8\") on node \"crc\" DevicePath \"\"" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.301793 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7210231-68df-4f2b-888f-90827f723bd2-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.301802 4628 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7210231-68df-4f2b-888f-90827f723bd2-client-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.753631 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.753628 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc" event={"ID":"e7210231-68df-4f2b-888f-90827f723bd2","Type":"ContainerDied","Data":"76760e00231de5b4f5c15f721f8952d34cef584ffd2033118fe03c6600327a25"} Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.753777 4628 scope.go:117] "RemoveContainer" containerID="d08f39bc9250bf57aab70d4639a0fe533b75a52f3aebed1c3c4b6980c8efd834" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.755148 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" event={"ID":"d5a26a57-89a7-4c5c-902c-a19020e4a01a","Type":"ContainerDied","Data":"c9d6d32a5bf1c8bcb6592efb173becfddc354f0563c037dbc40b1eca4c0c1ffa"} Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.755214 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8sttg" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.770068 4628 scope.go:117] "RemoveContainer" containerID="b63f7f7ad748d56c454956681b0aca20e830c39035f125d22e91829a5ca681dd" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.784528 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc"] Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.794953 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ns4xc"] Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.800857 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8sttg"] Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.808370 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8sttg"] Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.894902 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5a26a57-89a7-4c5c-902c-a19020e4a01a" path="/var/lib/kubelet/pods/d5a26a57-89a7-4c5c-902c-a19020e4a01a/volumes" Dec 11 05:19:05 crc kubenswrapper[4628]: I1211 05:19:05.895520 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7210231-68df-4f2b-888f-90827f723bd2" path="/var/lib/kubelet/pods/e7210231-68df-4f2b-888f-90827f723bd2/volumes" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.413361 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77b768d98c-tbg9q"] Dec 11 05:19:06 crc kubenswrapper[4628]: E1211 05:19:06.413819 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.413831 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 11 05:19:06 crc kubenswrapper[4628]: E1211 05:19:06.413868 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" containerName="registry-server" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.413875 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" containerName="registry-server" Dec 11 05:19:06 crc kubenswrapper[4628]: E1211 05:19:06.413888 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a26a57-89a7-4c5c-902c-a19020e4a01a" containerName="controller-manager" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.413895 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a26a57-89a7-4c5c-902c-a19020e4a01a" containerName="controller-manager" Dec 11 05:19:06 crc kubenswrapper[4628]: E1211 05:19:06.413905 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" containerName="extract-utilities" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.413911 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" containerName="extract-utilities" Dec 11 05:19:06 crc kubenswrapper[4628]: E1211 05:19:06.413918 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" containerName="installer" Dec 11 05:19:06 crc 
kubenswrapper[4628]: I1211 05:19:06.413924 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" containerName="installer" Dec 11 05:19:06 crc kubenswrapper[4628]: E1211 05:19:06.413933 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" containerName="extract-content" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.413938 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" containerName="extract-content" Dec 11 05:19:06 crc kubenswrapper[4628]: E1211 05:19:06.413947 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7210231-68df-4f2b-888f-90827f723bd2" containerName="route-controller-manager" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.413952 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7210231-68df-4f2b-888f-90827f723bd2" containerName="route-controller-manager" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.414038 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.414052 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="843a2bbd-6914-4686-a14e-f05f88ddcc07" containerName="installer" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.414061 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7210231-68df-4f2b-888f-90827f723bd2" containerName="route-controller-manager" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.414068 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a26a57-89a7-4c5c-902c-a19020e4a01a" containerName="controller-manager" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.414076 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a9eb6ef-92ff-415b-a526-26711b88985f" containerName="registry-server" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.414473 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.422601 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.422923 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.422979 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.422980 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.423331 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.425139 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.426612 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw"] Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.427249 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.433877 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.434302 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.434527 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.435679 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.436120 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw"] Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.436335 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.436662 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.439810 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.450312 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77b768d98c-tbg9q"] Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.515309 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/88c30408-54be-4665-9531-a8a986874961-client-ca\") pod \"route-controller-manager-5f665f7bd4-ndlzw\" (UID: \"88c30408-54be-4665-9531-a8a986874961\") " pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.515429 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88c30408-54be-4665-9531-a8a986874961-serving-cert\") pod \"route-controller-manager-5f665f7bd4-ndlzw\" (UID: \"88c30408-54be-4665-9531-a8a986874961\") " pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.515478 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8b8g\" (UniqueName: \"kubernetes.io/projected/88c30408-54be-4665-9531-a8a986874961-kube-api-access-c8b8g\") pod \"route-controller-manager-5f665f7bd4-ndlzw\" (UID: \"88c30408-54be-4665-9531-a8a986874961\") " pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.515547 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3952f293-0b50-4dea-a3d3-6626f6dbd853-proxy-ca-bundles\") pod \"controller-manager-77b768d98c-tbg9q\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.515593 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3952f293-0b50-4dea-a3d3-6626f6dbd853-config\") pod \"controller-manager-77b768d98c-tbg9q\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.515634 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3952f293-0b50-4dea-a3d3-6626f6dbd853-client-ca\") pod \"controller-manager-77b768d98c-tbg9q\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.515689 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3952f293-0b50-4dea-a3d3-6626f6dbd853-serving-cert\") pod \"controller-manager-77b768d98c-tbg9q\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.515717 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt9tb\" (UniqueName: \"kubernetes.io/projected/3952f293-0b50-4dea-a3d3-6626f6dbd853-kube-api-access-wt9tb\") pod \"controller-manager-77b768d98c-tbg9q\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.515739 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/88c30408-54be-4665-9531-a8a986874961-config\") pod \"route-controller-manager-5f665f7bd4-ndlzw\" (UID: \"88c30408-54be-4665-9531-a8a986874961\") " pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.616763 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3952f293-0b50-4dea-a3d3-6626f6dbd853-proxy-ca-bundles\") pod \"controller-manager-77b768d98c-tbg9q\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.617128 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3952f293-0b50-4dea-a3d3-6626f6dbd853-config\") pod \"controller-manager-77b768d98c-tbg9q\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.617218 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3952f293-0b50-4dea-a3d3-6626f6dbd853-client-ca\") pod \"controller-manager-77b768d98c-tbg9q\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.617312 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3952f293-0b50-4dea-a3d3-6626f6dbd853-serving-cert\") pod \"controller-manager-77b768d98c-tbg9q\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.617428 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt9tb\" (UniqueName: \"kubernetes.io/projected/3952f293-0b50-4dea-a3d3-6626f6dbd853-kube-api-access-wt9tb\") pod \"controller-manager-77b768d98c-tbg9q\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.617536 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88c30408-54be-4665-9531-a8a986874961-config\") pod \"route-controller-manager-5f665f7bd4-ndlzw\" (UID: \"88c30408-54be-4665-9531-a8a986874961\") " pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.617623 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/88c30408-54be-4665-9531-a8a986874961-client-ca\") pod \"route-controller-manager-5f665f7bd4-ndlzw\" (UID: \"88c30408-54be-4665-9531-a8a986874961\") " pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.617700 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88c30408-54be-4665-9531-a8a986874961-serving-cert\") pod \"route-controller-manager-5f665f7bd4-ndlzw\" (UID: 
\"88c30408-54be-4665-9531-a8a986874961\") " pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.617773 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8b8g\" (UniqueName: \"kubernetes.io/projected/88c30408-54be-4665-9531-a8a986874961-kube-api-access-c8b8g\") pod \"route-controller-manager-5f665f7bd4-ndlzw\" (UID: \"88c30408-54be-4665-9531-a8a986874961\") " pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.618186 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3952f293-0b50-4dea-a3d3-6626f6dbd853-client-ca\") pod \"controller-manager-77b768d98c-tbg9q\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.618536 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/88c30408-54be-4665-9531-a8a986874961-client-ca\") pod \"route-controller-manager-5f665f7bd4-ndlzw\" (UID: \"88c30408-54be-4665-9531-a8a986874961\") " pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.618594 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3952f293-0b50-4dea-a3d3-6626f6dbd853-proxy-ca-bundles\") pod \"controller-manager-77b768d98c-tbg9q\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.618791 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88c30408-54be-4665-9531-a8a986874961-config\") pod \"route-controller-manager-5f665f7bd4-ndlzw\" (UID: \"88c30408-54be-4665-9531-a8a986874961\") " pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.618909 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3952f293-0b50-4dea-a3d3-6626f6dbd853-config\") pod \"controller-manager-77b768d98c-tbg9q\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.622754 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3952f293-0b50-4dea-a3d3-6626f6dbd853-serving-cert\") pod \"controller-manager-77b768d98c-tbg9q\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.623436 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88c30408-54be-4665-9531-a8a986874961-serving-cert\") pod \"route-controller-manager-5f665f7bd4-ndlzw\" (UID: \"88c30408-54be-4665-9531-a8a986874961\") " pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.640425 4628 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt9tb\" (UniqueName: \"kubernetes.io/projected/3952f293-0b50-4dea-a3d3-6626f6dbd853-kube-api-access-wt9tb\") pod \"controller-manager-77b768d98c-tbg9q\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.644635 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8b8g\" (UniqueName: \"kubernetes.io/projected/88c30408-54be-4665-9531-a8a986874961-kube-api-access-c8b8g\") pod \"route-controller-manager-5f665f7bd4-ndlzw\" (UID: \"88c30408-54be-4665-9531-a8a986874961\") " pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.730352 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.747658 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.950108 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw"] Dec 11 05:19:06 crc kubenswrapper[4628]: I1211 05:19:06.966136 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77b768d98c-tbg9q"] Dec 11 05:19:06 crc kubenswrapper[4628]: W1211 05:19:06.969561 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3952f293_0b50_4dea_a3d3_6626f6dbd853.slice/crio-ee66d8329c8c44d6530c0d63bab16160a0f29af513e8e66127e9b42b0ebb4220 WatchSource:0}: Error finding container ee66d8329c8c44d6530c0d63bab16160a0f29af513e8e66127e9b42b0ebb4220: Status 404 returned error can't find the container with id ee66d8329c8c44d6530c0d63bab16160a0f29af513e8e66127e9b42b0ebb4220 Dec 11 05:19:07 crc kubenswrapper[4628]: I1211 05:19:07.794060 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" event={"ID":"88c30408-54be-4665-9531-a8a986874961","Type":"ContainerStarted","Data":"b06c0dd41b932214afb95fc76d6b7227c6a2af7fe78799e1ff7d9d9aa8fdebc2"} Dec 11 05:19:07 crc kubenswrapper[4628]: I1211 05:19:07.794423 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" Dec 11 05:19:07 crc kubenswrapper[4628]: I1211 05:19:07.794438 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" event={"ID":"88c30408-54be-4665-9531-a8a986874961","Type":"ContainerStarted","Data":"fa563a73bf24c958e796aa02ca4d30e6c7e362cf62b17bd62134eda3fa94b9ec"} Dec 11 05:19:07 crc kubenswrapper[4628]: I1211 05:19:07.796182 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" event={"ID":"3952f293-0b50-4dea-a3d3-6626f6dbd853","Type":"ContainerStarted","Data":"cf7e2ac1433c2fafeaff4ac295a61820181bde423ee678ad0b6e7579ce203efb"} Dec 11 05:19:07 crc kubenswrapper[4628]: I1211 05:19:07.796215 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" event={"ID":"3952f293-0b50-4dea-a3d3-6626f6dbd853","Type":"ContainerStarted","Data":"ee66d8329c8c44d6530c0d63bab16160a0f29af513e8e66127e9b42b0ebb4220"} Dec 11 05:19:07 crc kubenswrapper[4628]: I1211 05:19:07.796374 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:07 crc kubenswrapper[4628]: I1211 05:19:07.811959 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:19:07 crc kubenswrapper[4628]: I1211 05:19:07.845204 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" podStartSLOduration=3.8451853 podStartE2EDuration="3.8451853s" podCreationTimestamp="2025-12-11 05:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:19:07.833997461 +0000 UTC m=+250.251344159" watchObservedRunningTime="2025-12-11 05:19:07.8451853 +0000 UTC m=+250.262531998" Dec 11 05:19:07 crc kubenswrapper[4628]: I1211 05:19:07.855670 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 11 05:19:07 crc kubenswrapper[4628]: I1211 05:19:07.867045 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 11 05:19:08 crc kubenswrapper[4628]: I1211 05:19:08.090025 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 11 05:19:08 crc kubenswrapper[4628]: I1211 05:19:08.110874 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" Dec 11 05:19:08 crc kubenswrapper[4628]: I1211 05:19:08.140197 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" podStartSLOduration=4.14017952 podStartE2EDuration="4.14017952s" podCreationTimestamp="2025-12-11 05:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:19:07.907019908 +0000 UTC m=+250.324366606" watchObservedRunningTime="2025-12-11 05:19:08.14017952 +0000 UTC m=+250.557526218" Dec 11 05:19:09 crc kubenswrapper[4628]: I1211 05:19:09.042597 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 11 05:19:09 crc kubenswrapper[4628]: I1211 05:19:09.767696 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw"] Dec 11 05:19:09 crc kubenswrapper[4628]: I1211 05:19:09.814440 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 11 05:19:10 crc kubenswrapper[4628]: I1211 05:19:10.747594 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 11 05:19:10 crc kubenswrapper[4628]: I1211 05:19:10.808521 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" 
podUID="88c30408-54be-4665-9531-a8a986874961" containerName="route-controller-manager" containerID="cri-o://b06c0dd41b932214afb95fc76d6b7227c6a2af7fe78799e1ff7d9d9aa8fdebc2" gracePeriod=30 Dec 11 05:19:11 crc kubenswrapper[4628]: I1211 05:19:11.713193 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 11 05:19:12 crc kubenswrapper[4628]: I1211 05:19:12.355475 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 11 05:19:12 crc kubenswrapper[4628]: I1211 05:19:12.822837 4628 generic.go:334] "Generic (PLEG): container finished" podID="88c30408-54be-4665-9531-a8a986874961" containerID="b06c0dd41b932214afb95fc76d6b7227c6a2af7fe78799e1ff7d9d9aa8fdebc2" exitCode=0 Dec 11 05:19:12 crc kubenswrapper[4628]: I1211 05:19:12.822888 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" event={"ID":"88c30408-54be-4665-9531-a8a986874961","Type":"ContainerDied","Data":"b06c0dd41b932214afb95fc76d6b7227c6a2af7fe78799e1ff7d9d9aa8fdebc2"} Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.156716 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.188626 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2"] Dec 11 05:19:13 crc kubenswrapper[4628]: E1211 05:19:13.188969 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c30408-54be-4665-9531-a8a986874961" containerName="route-controller-manager" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.188992 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c30408-54be-4665-9531-a8a986874961" containerName="route-controller-manager" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.189148 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c30408-54be-4665-9531-a8a986874961" containerName="route-controller-manager" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.189738 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.199770 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2"] Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.210230 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8b8g\" (UniqueName: \"kubernetes.io/projected/88c30408-54be-4665-9531-a8a986874961-kube-api-access-c8b8g\") pod \"88c30408-54be-4665-9531-a8a986874961\" (UID: \"88c30408-54be-4665-9531-a8a986874961\") " Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.210353 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/88c30408-54be-4665-9531-a8a986874961-client-ca\") pod \"88c30408-54be-4665-9531-a8a986874961\" (UID: \"88c30408-54be-4665-9531-a8a986874961\") " Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.210392 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88c30408-54be-4665-9531-a8a986874961-config\") pod \"88c30408-54be-4665-9531-a8a986874961\" (UID: \"88c30408-54be-4665-9531-a8a986874961\") " Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.210437 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88c30408-54be-4665-9531-a8a986874961-serving-cert\") pod \"88c30408-54be-4665-9531-a8a986874961\" (UID: \"88c30408-54be-4665-9531-a8a986874961\") " Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.210563 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb8cf\" (UniqueName: \"kubernetes.io/projected/4f109bad-8750-4141-9a03-31896c5c8a0f-kube-api-access-cb8cf\") pod \"route-controller-manager-575556bbc6-mv8z2\" (UID: \"4f109bad-8750-4141-9a03-31896c5c8a0f\") " pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.210632 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f109bad-8750-4141-9a03-31896c5c8a0f-config\") pod \"route-controller-manager-575556bbc6-mv8z2\" (UID: \"4f109bad-8750-4141-9a03-31896c5c8a0f\") " pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.210671 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f109bad-8750-4141-9a03-31896c5c8a0f-serving-cert\") pod \"route-controller-manager-575556bbc6-mv8z2\" (UID: \"4f109bad-8750-4141-9a03-31896c5c8a0f\") " pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.210698 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f109bad-8750-4141-9a03-31896c5c8a0f-client-ca\") pod \"route-controller-manager-575556bbc6-mv8z2\" (UID: \"4f109bad-8750-4141-9a03-31896c5c8a0f\") " pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" Dec 11 05:19:13 crc 
kubenswrapper[4628]: I1211 05:19:13.211179 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88c30408-54be-4665-9531-a8a986874961-client-ca" (OuterVolumeSpecName: "client-ca") pod "88c30408-54be-4665-9531-a8a986874961" (UID: "88c30408-54be-4665-9531-a8a986874961"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.213400 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88c30408-54be-4665-9531-a8a986874961-config" (OuterVolumeSpecName: "config") pod "88c30408-54be-4665-9531-a8a986874961" (UID: "88c30408-54be-4665-9531-a8a986874961"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.217100 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88c30408-54be-4665-9531-a8a986874961-kube-api-access-c8b8g" (OuterVolumeSpecName: "kube-api-access-c8b8g") pod "88c30408-54be-4665-9531-a8a986874961" (UID: "88c30408-54be-4665-9531-a8a986874961"). InnerVolumeSpecName "kube-api-access-c8b8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.222599 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88c30408-54be-4665-9531-a8a986874961-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "88c30408-54be-4665-9531-a8a986874961" (UID: "88c30408-54be-4665-9531-a8a986874961"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.311870 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f109bad-8750-4141-9a03-31896c5c8a0f-config\") pod \"route-controller-manager-575556bbc6-mv8z2\" (UID: \"4f109bad-8750-4141-9a03-31896c5c8a0f\") " pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.311928 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f109bad-8750-4141-9a03-31896c5c8a0f-serving-cert\") pod \"route-controller-manager-575556bbc6-mv8z2\" (UID: \"4f109bad-8750-4141-9a03-31896c5c8a0f\") " pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.311956 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f109bad-8750-4141-9a03-31896c5c8a0f-client-ca\") pod \"route-controller-manager-575556bbc6-mv8z2\" (UID: \"4f109bad-8750-4141-9a03-31896c5c8a0f\") " pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.312007 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb8cf\" (UniqueName: \"kubernetes.io/projected/4f109bad-8750-4141-9a03-31896c5c8a0f-kube-api-access-cb8cf\") pod \"route-controller-manager-575556bbc6-mv8z2\" (UID: \"4f109bad-8750-4141-9a03-31896c5c8a0f\") " pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.312078 4628 reconciler_common.go:293] 
"Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/88c30408-54be-4665-9531-a8a986874961-client-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.312105 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88c30408-54be-4665-9531-a8a986874961-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.312115 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88c30408-54be-4665-9531-a8a986874961-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.312123 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8b8g\" (UniqueName: \"kubernetes.io/projected/88c30408-54be-4665-9531-a8a986874961-kube-api-access-c8b8g\") on node \"crc\" DevicePath \"\"" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.312940 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f109bad-8750-4141-9a03-31896c5c8a0f-config\") pod \"route-controller-manager-575556bbc6-mv8z2\" (UID: \"4f109bad-8750-4141-9a03-31896c5c8a0f\") " pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.313445 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f109bad-8750-4141-9a03-31896c5c8a0f-client-ca\") pod \"route-controller-manager-575556bbc6-mv8z2\" (UID: \"4f109bad-8750-4141-9a03-31896c5c8a0f\") " pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.319736 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f109bad-8750-4141-9a03-31896c5c8a0f-serving-cert\") pod \"route-controller-manager-575556bbc6-mv8z2\" (UID: \"4f109bad-8750-4141-9a03-31896c5c8a0f\") " pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.344441 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb8cf\" (UniqueName: \"kubernetes.io/projected/4f109bad-8750-4141-9a03-31896c5c8a0f-kube-api-access-cb8cf\") pod \"route-controller-manager-575556bbc6-mv8z2\" (UID: \"4f109bad-8750-4141-9a03-31896c5c8a0f\") " pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.515752 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.726885 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.829319 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" event={"ID":"88c30408-54be-4665-9531-a8a986874961","Type":"ContainerDied","Data":"fa563a73bf24c958e796aa02ca4d30e6c7e362cf62b17bd62134eda3fa94b9ec"} Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.829377 4628 scope.go:117] "RemoveContainer" containerID="b06c0dd41b932214afb95fc76d6b7227c6a2af7fe78799e1ff7d9d9aa8fdebc2" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.829407 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.862620 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw"] Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.865564 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f665f7bd4-ndlzw"] Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.894830 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88c30408-54be-4665-9531-a8a986874961" path="/var/lib/kubelet/pods/88c30408-54be-4665-9531-a8a986874961/volumes" Dec 11 05:19:13 crc kubenswrapper[4628]: I1211 05:19:13.955352 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2"] Dec 11 05:19:14 crc kubenswrapper[4628]: I1211 05:19:14.836891 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" event={"ID":"4f109bad-8750-4141-9a03-31896c5c8a0f","Type":"ContainerStarted","Data":"e97612012c16d6ca918998e691115dca340916c4ad1d7349f23558ac5392c76d"} Dec 11 05:19:14 crc kubenswrapper[4628]: I1211 05:19:14.836961 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" event={"ID":"4f109bad-8750-4141-9a03-31896c5c8a0f","Type":"ContainerStarted","Data":"3bb0274fd4c6f5de9303db3aeffd3f7e3e2549a7b3be482dd90857702ad781ef"} Dec 11 05:19:14 crc kubenswrapper[4628]: I1211 05:19:14.862083 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" podStartSLOduration=5.862061939 podStartE2EDuration="5.862061939s" podCreationTimestamp="2025-12-11 05:19:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:19:14.857656434 +0000 UTC m=+257.275003152" watchObservedRunningTime="2025-12-11 05:19:14.862061939 +0000 UTC m=+257.279408647" Dec 11 05:19:15 crc kubenswrapper[4628]: I1211 05:19:15.845671 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" Dec 11 05:19:15 crc kubenswrapper[4628]: I1211 05:19:15.853109 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-route-controller-manager/route-controller-manager-575556bbc6-mv8z2" Dec 11 05:19:16 crc kubenswrapper[4628]: I1211 05:19:16.701789 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 11 05:19:16 crc kubenswrapper[4628]: I1211 05:19:16.913615 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 11 05:19:18 crc kubenswrapper[4628]: I1211 05:19:18.375495 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 11 05:19:20 crc kubenswrapper[4628]: I1211 05:19:20.414126 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 11 05:19:20 crc kubenswrapper[4628]: I1211 05:19:20.617790 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.597685 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-867nx"] Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.599215 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.634860 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-867nx"] Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.699738 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.727407 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.727687 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b020a072-4db0-48f4-8121-25b30879d777-registry-tls\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.727799 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b020a072-4db0-48f4-8121-25b30879d777-bound-sa-token\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.727929 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b020a072-4db0-48f4-8121-25b30879d777-trusted-ca\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 
05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.728024 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b020a072-4db0-48f4-8121-25b30879d777-installation-pull-secrets\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.728095 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncmp7\" (UniqueName: \"kubernetes.io/projected/b020a072-4db0-48f4-8121-25b30879d777-kube-api-access-ncmp7\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.728178 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b020a072-4db0-48f4-8121-25b30879d777-registry-certificates\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.728258 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b020a072-4db0-48f4-8121-25b30879d777-ca-trust-extracted\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.752940 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.828942 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b020a072-4db0-48f4-8121-25b30879d777-trusted-ca\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.829215 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b020a072-4db0-48f4-8121-25b30879d777-installation-pull-secrets\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.829342 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncmp7\" (UniqueName: \"kubernetes.io/projected/b020a072-4db0-48f4-8121-25b30879d777-kube-api-access-ncmp7\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.829372 4628 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b020a072-4db0-48f4-8121-25b30879d777-registry-certificates\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.829408 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b020a072-4db0-48f4-8121-25b30879d777-ca-trust-extracted\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.829467 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b020a072-4db0-48f4-8121-25b30879d777-registry-tls\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.829494 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b020a072-4db0-48f4-8121-25b30879d777-bound-sa-token\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.830730 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b020a072-4db0-48f4-8121-25b30879d777-trusted-ca\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.831148 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b020a072-4db0-48f4-8121-25b30879d777-registry-certificates\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.831476 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b020a072-4db0-48f4-8121-25b30879d777-ca-trust-extracted\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.837534 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b020a072-4db0-48f4-8121-25b30879d777-registry-tls\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.838879 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b020a072-4db0-48f4-8121-25b30879d777-installation-pull-secrets\") pod \"image-registry-66df7c8f76-867nx\" (UID: 
\"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.848512 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncmp7\" (UniqueName: \"kubernetes.io/projected/b020a072-4db0-48f4-8121-25b30879d777-kube-api-access-ncmp7\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.852440 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b020a072-4db0-48f4-8121-25b30879d777-bound-sa-token\") pod \"image-registry-66df7c8f76-867nx\" (UID: \"b020a072-4db0-48f4-8121-25b30879d777\") " pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:22 crc kubenswrapper[4628]: I1211 05:19:22.923026 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:23 crc kubenswrapper[4628]: I1211 05:19:23.372072 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-867nx"] Dec 11 05:19:23 crc kubenswrapper[4628]: W1211 05:19:23.374410 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb020a072_4db0_48f4_8121_25b30879d777.slice/crio-b3842d472c100ebb14b7221be03e28d363720b7f4f9e95c3f940504178f08fa9 WatchSource:0}: Error finding container b3842d472c100ebb14b7221be03e28d363720b7f4f9e95c3f940504178f08fa9: Status 404 returned error can't find the container with id b3842d472c100ebb14b7221be03e28d363720b7f4f9e95c3f940504178f08fa9 Dec 11 05:19:23 crc kubenswrapper[4628]: I1211 05:19:23.675697 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 11 05:19:23 crc kubenswrapper[4628]: I1211 05:19:23.887241 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-867nx" event={"ID":"b020a072-4db0-48f4-8121-25b30879d777","Type":"ContainerStarted","Data":"064479872e6bb4a27ff3741f06e7f1060a107b8690ba4586fb906ff9b8b4b8dd"} Dec 11 05:19:23 crc kubenswrapper[4628]: I1211 05:19:23.887285 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-867nx" event={"ID":"b020a072-4db0-48f4-8121-25b30879d777","Type":"ContainerStarted","Data":"b3842d472c100ebb14b7221be03e28d363720b7f4f9e95c3f940504178f08fa9"} Dec 11 05:19:23 crc kubenswrapper[4628]: I1211 05:19:23.887786 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:23 crc kubenswrapper[4628]: I1211 05:19:23.913295 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-867nx" podStartSLOduration=1.913264468 podStartE2EDuration="1.913264468s" podCreationTimestamp="2025-12-11 05:19:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:19:23.907630857 +0000 UTC m=+266.324977585" watchObservedRunningTime="2025-12-11 05:19:23.913264468 +0000 UTC m=+266.330611186" Dec 11 05:19:24 crc kubenswrapper[4628]: I1211 05:19:24.154563 4628 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v9ml8"] Dec 11 05:19:24 crc kubenswrapper[4628]: I1211 05:19:24.155891 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v9ml8" Dec 11 05:19:24 crc kubenswrapper[4628]: I1211 05:19:24.158973 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 11 05:19:24 crc kubenswrapper[4628]: I1211 05:19:24.179301 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v9ml8"] Dec 11 05:19:24 crc kubenswrapper[4628]: I1211 05:19:24.263731 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1faf62cf-c5ee-426d-afb5-25a16930ddbd-catalog-content\") pod \"redhat-operators-v9ml8\" (UID: \"1faf62cf-c5ee-426d-afb5-25a16930ddbd\") " pod="openshift-marketplace/redhat-operators-v9ml8" Dec 11 05:19:24 crc kubenswrapper[4628]: I1211 05:19:24.264079 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7plz\" (UniqueName: \"kubernetes.io/projected/1faf62cf-c5ee-426d-afb5-25a16930ddbd-kube-api-access-d7plz\") pod \"redhat-operators-v9ml8\" (UID: \"1faf62cf-c5ee-426d-afb5-25a16930ddbd\") " pod="openshift-marketplace/redhat-operators-v9ml8" Dec 11 05:19:24 crc kubenswrapper[4628]: I1211 05:19:24.264308 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1faf62cf-c5ee-426d-afb5-25a16930ddbd-utilities\") pod \"redhat-operators-v9ml8\" (UID: \"1faf62cf-c5ee-426d-afb5-25a16930ddbd\") " pod="openshift-marketplace/redhat-operators-v9ml8" Dec 11 05:19:24 crc kubenswrapper[4628]: I1211 05:19:24.270011 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 11 05:19:24 crc kubenswrapper[4628]: I1211 05:19:24.365173 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1faf62cf-c5ee-426d-afb5-25a16930ddbd-utilities\") pod \"redhat-operators-v9ml8\" (UID: \"1faf62cf-c5ee-426d-afb5-25a16930ddbd\") " pod="openshift-marketplace/redhat-operators-v9ml8" Dec 11 05:19:24 crc kubenswrapper[4628]: I1211 05:19:24.365238 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1faf62cf-c5ee-426d-afb5-25a16930ddbd-catalog-content\") pod \"redhat-operators-v9ml8\" (UID: \"1faf62cf-c5ee-426d-afb5-25a16930ddbd\") " pod="openshift-marketplace/redhat-operators-v9ml8" Dec 11 05:19:24 crc kubenswrapper[4628]: I1211 05:19:24.365261 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7plz\" (UniqueName: \"kubernetes.io/projected/1faf62cf-c5ee-426d-afb5-25a16930ddbd-kube-api-access-d7plz\") pod \"redhat-operators-v9ml8\" (UID: \"1faf62cf-c5ee-426d-afb5-25a16930ddbd\") " pod="openshift-marketplace/redhat-operators-v9ml8" Dec 11 05:19:24 crc kubenswrapper[4628]: I1211 05:19:24.365790 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1faf62cf-c5ee-426d-afb5-25a16930ddbd-utilities\") pod \"redhat-operators-v9ml8\" (UID: 
\"1faf62cf-c5ee-426d-afb5-25a16930ddbd\") " pod="openshift-marketplace/redhat-operators-v9ml8" Dec 11 05:19:24 crc kubenswrapper[4628]: I1211 05:19:24.365815 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1faf62cf-c5ee-426d-afb5-25a16930ddbd-catalog-content\") pod \"redhat-operators-v9ml8\" (UID: \"1faf62cf-c5ee-426d-afb5-25a16930ddbd\") " pod="openshift-marketplace/redhat-operators-v9ml8" Dec 11 05:19:24 crc kubenswrapper[4628]: I1211 05:19:24.391715 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7plz\" (UniqueName: \"kubernetes.io/projected/1faf62cf-c5ee-426d-afb5-25a16930ddbd-kube-api-access-d7plz\") pod \"redhat-operators-v9ml8\" (UID: \"1faf62cf-c5ee-426d-afb5-25a16930ddbd\") " pod="openshift-marketplace/redhat-operators-v9ml8" Dec 11 05:19:24 crc kubenswrapper[4628]: I1211 05:19:24.533987 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v9ml8" Dec 11 05:19:24 crc kubenswrapper[4628]: I1211 05:19:24.996153 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v9ml8"] Dec 11 05:19:25 crc kubenswrapper[4628]: I1211 05:19:25.903249 4628 generic.go:334] "Generic (PLEG): container finished" podID="1faf62cf-c5ee-426d-afb5-25a16930ddbd" containerID="d5578e10493470513e679215447de66f87f6b4a556f1116f11ad9c52f7743815" exitCode=0 Dec 11 05:19:25 crc kubenswrapper[4628]: I1211 05:19:25.903312 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9ml8" event={"ID":"1faf62cf-c5ee-426d-afb5-25a16930ddbd","Type":"ContainerDied","Data":"d5578e10493470513e679215447de66f87f6b4a556f1116f11ad9c52f7743815"} Dec 11 05:19:25 crc kubenswrapper[4628]: I1211 05:19:25.903621 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9ml8" event={"ID":"1faf62cf-c5ee-426d-afb5-25a16930ddbd","Type":"ContainerStarted","Data":"be89aafba5e8efbfb5839d5e99095bb237fc0f854463b77424a594c4cc5213a1"} Dec 11 05:19:26 crc kubenswrapper[4628]: I1211 05:19:26.269228 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 11 05:19:26 crc kubenswrapper[4628]: I1211 05:19:26.909052 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9ml8" event={"ID":"1faf62cf-c5ee-426d-afb5-25a16930ddbd","Type":"ContainerStarted","Data":"b5c55e3798c8ca8c5c5fea468a44d72f3b696ea6f6cdd0c8d911644274f6fe7d"} Dec 11 05:19:27 crc kubenswrapper[4628]: I1211 05:19:27.916311 4628 generic.go:334] "Generic (PLEG): container finished" podID="1faf62cf-c5ee-426d-afb5-25a16930ddbd" containerID="b5c55e3798c8ca8c5c5fea468a44d72f3b696ea6f6cdd0c8d911644274f6fe7d" exitCode=0 Dec 11 05:19:27 crc kubenswrapper[4628]: I1211 05:19:27.916354 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9ml8" event={"ID":"1faf62cf-c5ee-426d-afb5-25a16930ddbd","Type":"ContainerDied","Data":"b5c55e3798c8ca8c5c5fea468a44d72f3b696ea6f6cdd0c8d911644274f6fe7d"} Dec 11 05:19:28 crc kubenswrapper[4628]: I1211 05:19:28.925290 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9ml8" event={"ID":"1faf62cf-c5ee-426d-afb5-25a16930ddbd","Type":"ContainerStarted","Data":"6842b07eaa4a7ac39fe1f2d762ee3bc1058cd2e320c26eccac67a9c1a5580fd1"} Dec 11 
05:19:28 crc kubenswrapper[4628]: I1211 05:19:28.946141 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v9ml8" podStartSLOduration=2.295789042 podStartE2EDuration="4.946122533s" podCreationTimestamp="2025-12-11 05:19:24 +0000 UTC" firstStartedPulling="2025-12-11 05:19:25.906486605 +0000 UTC m=+268.323833343" lastFinishedPulling="2025-12-11 05:19:28.556820096 +0000 UTC m=+270.974166834" observedRunningTime="2025-12-11 05:19:28.944675711 +0000 UTC m=+271.362022419" watchObservedRunningTime="2025-12-11 05:19:28.946122533 +0000 UTC m=+271.363469241" Dec 11 05:19:34 crc kubenswrapper[4628]: I1211 05:19:34.534155 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v9ml8" Dec 11 05:19:34 crc kubenswrapper[4628]: I1211 05:19:34.534786 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v9ml8" Dec 11 05:19:35 crc kubenswrapper[4628]: I1211 05:19:35.578525 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v9ml8" podUID="1faf62cf-c5ee-426d-afb5-25a16930ddbd" containerName="registry-server" probeResult="failure" output=< Dec 11 05:19:35 crc kubenswrapper[4628]: timeout: failed to connect service ":50051" within 1s Dec 11 05:19:35 crc kubenswrapper[4628]: > Dec 11 05:19:42 crc kubenswrapper[4628]: I1211 05:19:42.928764 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-867nx" Dec 11 05:19:43 crc kubenswrapper[4628]: I1211 05:19:43.003917 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6qvrg"] Dec 11 05:19:44 crc kubenswrapper[4628]: I1211 05:19:44.612701 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v9ml8" Dec 11 05:19:44 crc kubenswrapper[4628]: I1211 05:19:44.668233 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v9ml8" Dec 11 05:19:57 crc kubenswrapper[4628]: I1211 05:19:57.750179 4628 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.066100 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77b768d98c-tbg9q"] Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.066968 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" podUID="3952f293-0b50-4dea-a3d3-6626f6dbd853" containerName="controller-manager" containerID="cri-o://cf7e2ac1433c2fafeaff4ac295a61820181bde423ee678ad0b6e7579ce203efb" gracePeriod=30 Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.259819 4628 generic.go:334] "Generic (PLEG): container finished" podID="3952f293-0b50-4dea-a3d3-6626f6dbd853" containerID="cf7e2ac1433c2fafeaff4ac295a61820181bde423ee678ad0b6e7579ce203efb" exitCode=0 Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.259912 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" event={"ID":"3952f293-0b50-4dea-a3d3-6626f6dbd853","Type":"ContainerDied","Data":"cf7e2ac1433c2fafeaff4ac295a61820181bde423ee678ad0b6e7579ce203efb"} Dec 11 05:20:04 crc 
kubenswrapper[4628]: I1211 05:20:04.512958 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.606592 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3952f293-0b50-4dea-a3d3-6626f6dbd853-config\") pod \"3952f293-0b50-4dea-a3d3-6626f6dbd853\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.606628 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3952f293-0b50-4dea-a3d3-6626f6dbd853-client-ca\") pod \"3952f293-0b50-4dea-a3d3-6626f6dbd853\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.606691 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3952f293-0b50-4dea-a3d3-6626f6dbd853-proxy-ca-bundles\") pod \"3952f293-0b50-4dea-a3d3-6626f6dbd853\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.606749 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt9tb\" (UniqueName: \"kubernetes.io/projected/3952f293-0b50-4dea-a3d3-6626f6dbd853-kube-api-access-wt9tb\") pod \"3952f293-0b50-4dea-a3d3-6626f6dbd853\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.606801 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3952f293-0b50-4dea-a3d3-6626f6dbd853-serving-cert\") pod \"3952f293-0b50-4dea-a3d3-6626f6dbd853\" (UID: \"3952f293-0b50-4dea-a3d3-6626f6dbd853\") " Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.608240 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3952f293-0b50-4dea-a3d3-6626f6dbd853-client-ca" (OuterVolumeSpecName: "client-ca") pod "3952f293-0b50-4dea-a3d3-6626f6dbd853" (UID: "3952f293-0b50-4dea-a3d3-6626f6dbd853"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.608903 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3952f293-0b50-4dea-a3d3-6626f6dbd853-config" (OuterVolumeSpecName: "config") pod "3952f293-0b50-4dea-a3d3-6626f6dbd853" (UID: "3952f293-0b50-4dea-a3d3-6626f6dbd853"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.609568 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3952f293-0b50-4dea-a3d3-6626f6dbd853-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3952f293-0b50-4dea-a3d3-6626f6dbd853" (UID: "3952f293-0b50-4dea-a3d3-6626f6dbd853"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.612083 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3952f293-0b50-4dea-a3d3-6626f6dbd853-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3952f293-0b50-4dea-a3d3-6626f6dbd853" (UID: "3952f293-0b50-4dea-a3d3-6626f6dbd853"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.612172 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3952f293-0b50-4dea-a3d3-6626f6dbd853-kube-api-access-wt9tb" (OuterVolumeSpecName: "kube-api-access-wt9tb") pod "3952f293-0b50-4dea-a3d3-6626f6dbd853" (UID: "3952f293-0b50-4dea-a3d3-6626f6dbd853"). InnerVolumeSpecName "kube-api-access-wt9tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.708071 4628 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3952f293-0b50-4dea-a3d3-6626f6dbd853-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.708291 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt9tb\" (UniqueName: \"kubernetes.io/projected/3952f293-0b50-4dea-a3d3-6626f6dbd853-kube-api-access-wt9tb\") on node \"crc\" DevicePath \"\"" Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.708363 4628 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3952f293-0b50-4dea-a3d3-6626f6dbd853-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.708420 4628 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3952f293-0b50-4dea-a3d3-6626f6dbd853-client-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:20:04 crc kubenswrapper[4628]: I1211 05:20:04.708481 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3952f293-0b50-4dea-a3d3-6626f6dbd853-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.269302 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" event={"ID":"3952f293-0b50-4dea-a3d3-6626f6dbd853","Type":"ContainerDied","Data":"ee66d8329c8c44d6530c0d63bab16160a0f29af513e8e66127e9b42b0ebb4220"} Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.269396 4628 scope.go:117] "RemoveContainer" containerID="cf7e2ac1433c2fafeaff4ac295a61820181bde423ee678ad0b6e7579ce203efb" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.269404 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77b768d98c-tbg9q" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.331285 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77b768d98c-tbg9q"] Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.338640 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77b768d98c-tbg9q"] Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.654879 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-698b5b6b4b-xqm92"] Dec 11 05:20:05 crc kubenswrapper[4628]: E1211 05:20:05.655482 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3952f293-0b50-4dea-a3d3-6626f6dbd853" containerName="controller-manager" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.655740 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="3952f293-0b50-4dea-a3d3-6626f6dbd853" containerName="controller-manager" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.656066 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="3952f293-0b50-4dea-a3d3-6626f6dbd853" containerName="controller-manager" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.656831 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.661084 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.661399 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.662028 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.662829 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.663780 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.663879 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.678709 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.679398 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-698b5b6b4b-xqm92"] Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.723150 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30932922-5cb5-4153-9143-e4935d92eb82-serving-cert\") pod \"controller-manager-698b5b6b4b-xqm92\" (UID: \"30932922-5cb5-4153-9143-e4935d92eb82\") " pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.723415 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30932922-5cb5-4153-9143-e4935d92eb82-client-ca\") pod \"controller-manager-698b5b6b4b-xqm92\" (UID: \"30932922-5cb5-4153-9143-e4935d92eb82\") " pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.723635 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d95bz\" (UniqueName: \"kubernetes.io/projected/30932922-5cb5-4153-9143-e4935d92eb82-kube-api-access-d95bz\") pod \"controller-manager-698b5b6b4b-xqm92\" (UID: \"30932922-5cb5-4153-9143-e4935d92eb82\") " pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.723836 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/30932922-5cb5-4153-9143-e4935d92eb82-proxy-ca-bundles\") pod \"controller-manager-698b5b6b4b-xqm92\" (UID: \"30932922-5cb5-4153-9143-e4935d92eb82\") " pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.724100 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30932922-5cb5-4153-9143-e4935d92eb82-config\") pod \"controller-manager-698b5b6b4b-xqm92\" (UID: \"30932922-5cb5-4153-9143-e4935d92eb82\") " pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.825510 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30932922-5cb5-4153-9143-e4935d92eb82-serving-cert\") pod \"controller-manager-698b5b6b4b-xqm92\" (UID: \"30932922-5cb5-4153-9143-e4935d92eb82\") " pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.825572 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30932922-5cb5-4153-9143-e4935d92eb82-client-ca\") pod \"controller-manager-698b5b6b4b-xqm92\" (UID: \"30932922-5cb5-4153-9143-e4935d92eb82\") " pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.825615 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d95bz\" (UniqueName: \"kubernetes.io/projected/30932922-5cb5-4153-9143-e4935d92eb82-kube-api-access-d95bz\") pod \"controller-manager-698b5b6b4b-xqm92\" (UID: \"30932922-5cb5-4153-9143-e4935d92eb82\") " pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.825661 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/30932922-5cb5-4153-9143-e4935d92eb82-proxy-ca-bundles\") pod \"controller-manager-698b5b6b4b-xqm92\" (UID: \"30932922-5cb5-4153-9143-e4935d92eb82\") " pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.825701 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30932922-5cb5-4153-9143-e4935d92eb82-config\") pod 
\"controller-manager-698b5b6b4b-xqm92\" (UID: \"30932922-5cb5-4153-9143-e4935d92eb82\") " pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.827717 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30932922-5cb5-4153-9143-e4935d92eb82-config\") pod \"controller-manager-698b5b6b4b-xqm92\" (UID: \"30932922-5cb5-4153-9143-e4935d92eb82\") " pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.827900 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30932922-5cb5-4153-9143-e4935d92eb82-client-ca\") pod \"controller-manager-698b5b6b4b-xqm92\" (UID: \"30932922-5cb5-4153-9143-e4935d92eb82\") " pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.829392 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/30932922-5cb5-4153-9143-e4935d92eb82-proxy-ca-bundles\") pod \"controller-manager-698b5b6b4b-xqm92\" (UID: \"30932922-5cb5-4153-9143-e4935d92eb82\") " pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.835230 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30932922-5cb5-4153-9143-e4935d92eb82-serving-cert\") pod \"controller-manager-698b5b6b4b-xqm92\" (UID: \"30932922-5cb5-4153-9143-e4935d92eb82\") " pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.860193 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d95bz\" (UniqueName: \"kubernetes.io/projected/30932922-5cb5-4153-9143-e4935d92eb82-kube-api-access-d95bz\") pod \"controller-manager-698b5b6b4b-xqm92\" (UID: \"30932922-5cb5-4153-9143-e4935d92eb82\") " pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.902071 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3952f293-0b50-4dea-a3d3-6626f6dbd853" path="/var/lib/kubelet/pods/3952f293-0b50-4dea-a3d3-6626f6dbd853/volumes" Dec 11 05:20:05 crc kubenswrapper[4628]: I1211 05:20:05.991907 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:06 crc kubenswrapper[4628]: I1211 05:20:06.262512 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-698b5b6b4b-xqm92"] Dec 11 05:20:06 crc kubenswrapper[4628]: I1211 05:20:06.285531 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" event={"ID":"30932922-5cb5-4153-9143-e4935d92eb82","Type":"ContainerStarted","Data":"6d98bf43f10a8b2eda0b073d7218f1bc8dea4c817905b4e45e3fcce61b996d66"} Dec 11 05:20:07 crc kubenswrapper[4628]: I1211 05:20:07.298212 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" event={"ID":"30932922-5cb5-4153-9143-e4935d92eb82","Type":"ContainerStarted","Data":"afc44f557677d3303d79cbd5241f86af9702a7effea880d55c6ce71141de0d8b"} Dec 11 05:20:07 crc kubenswrapper[4628]: I1211 05:20:07.298610 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:07 crc kubenswrapper[4628]: I1211 05:20:07.307410 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" Dec 11 05:20:07 crc kubenswrapper[4628]: I1211 05:20:07.332023 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-698b5b6b4b-xqm92" podStartSLOduration=3.332000961 podStartE2EDuration="3.332000961s" podCreationTimestamp="2025-12-11 05:20:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:20:07.327457501 +0000 UTC m=+309.744804249" watchObservedRunningTime="2025-12-11 05:20:07.332000961 +0000 UTC m=+309.749347669" Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.062888 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" podUID="fa963e29-dda2-4d61-827f-2da2d53bfe52" containerName="registry" containerID="cri-o://c230ba630b8d97f4a4826e1ff29a086a5a0da65b17f686c25ca5f3c18789357b" gracePeriod=30 Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.313216 4628 generic.go:334] "Generic (PLEG): container finished" podID="fa963e29-dda2-4d61-827f-2da2d53bfe52" containerID="c230ba630b8d97f4a4826e1ff29a086a5a0da65b17f686c25ca5f3c18789357b" exitCode=0 Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.314224 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" event={"ID":"fa963e29-dda2-4d61-827f-2da2d53bfe52","Type":"ContainerDied","Data":"c230ba630b8d97f4a4826e1ff29a086a5a0da65b17f686c25ca5f3c18789357b"} Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.485023 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.589393 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa963e29-dda2-4d61-827f-2da2d53bfe52-ca-trust-extracted\") pod \"fa963e29-dda2-4d61-827f-2da2d53bfe52\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.589580 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"fa963e29-dda2-4d61-827f-2da2d53bfe52\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.589633 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa963e29-dda2-4d61-827f-2da2d53bfe52-registry-certificates\") pod \"fa963e29-dda2-4d61-827f-2da2d53bfe52\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.589656 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa963e29-dda2-4d61-827f-2da2d53bfe52-bound-sa-token\") pod \"fa963e29-dda2-4d61-827f-2da2d53bfe52\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.589697 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa963e29-dda2-4d61-827f-2da2d53bfe52-trusted-ca\") pod \"fa963e29-dda2-4d61-827f-2da2d53bfe52\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.589730 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa963e29-dda2-4d61-827f-2da2d53bfe52-installation-pull-secrets\") pod \"fa963e29-dda2-4d61-827f-2da2d53bfe52\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.589776 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkcsh\" (UniqueName: \"kubernetes.io/projected/fa963e29-dda2-4d61-827f-2da2d53bfe52-kube-api-access-nkcsh\") pod \"fa963e29-dda2-4d61-827f-2da2d53bfe52\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.589811 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa963e29-dda2-4d61-827f-2da2d53bfe52-registry-tls\") pod \"fa963e29-dda2-4d61-827f-2da2d53bfe52\" (UID: \"fa963e29-dda2-4d61-827f-2da2d53bfe52\") " Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.590962 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa963e29-dda2-4d61-827f-2da2d53bfe52-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fa963e29-dda2-4d61-827f-2da2d53bfe52" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.591793 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa963e29-dda2-4d61-827f-2da2d53bfe52-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fa963e29-dda2-4d61-827f-2da2d53bfe52" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.595943 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa963e29-dda2-4d61-827f-2da2d53bfe52-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fa963e29-dda2-4d61-827f-2da2d53bfe52" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.601045 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa963e29-dda2-4d61-827f-2da2d53bfe52-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fa963e29-dda2-4d61-827f-2da2d53bfe52" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.606059 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa963e29-dda2-4d61-827f-2da2d53bfe52-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fa963e29-dda2-4d61-827f-2da2d53bfe52" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.607945 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa963e29-dda2-4d61-827f-2da2d53bfe52-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fa963e29-dda2-4d61-827f-2da2d53bfe52" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.608197 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa963e29-dda2-4d61-827f-2da2d53bfe52-kube-api-access-nkcsh" (OuterVolumeSpecName: "kube-api-access-nkcsh") pod "fa963e29-dda2-4d61-827f-2da2d53bfe52" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52"). InnerVolumeSpecName "kube-api-access-nkcsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.613710 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "fa963e29-dda2-4d61-827f-2da2d53bfe52" (UID: "fa963e29-dda2-4d61-827f-2da2d53bfe52"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.691628 4628 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa963e29-dda2-4d61-827f-2da2d53bfe52-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.691663 4628 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa963e29-dda2-4d61-827f-2da2d53bfe52-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.691677 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkcsh\" (UniqueName: \"kubernetes.io/projected/fa963e29-dda2-4d61-827f-2da2d53bfe52-kube-api-access-nkcsh\") on node \"crc\" DevicePath \"\"" Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.691685 4628 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa963e29-dda2-4d61-827f-2da2d53bfe52-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.691693 4628 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa963e29-dda2-4d61-827f-2da2d53bfe52-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.691701 4628 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa963e29-dda2-4d61-827f-2da2d53bfe52-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 11 05:20:08 crc kubenswrapper[4628]: I1211 05:20:08.691709 4628 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa963e29-dda2-4d61-827f-2da2d53bfe52-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 11 05:20:09 crc kubenswrapper[4628]: I1211 05:20:09.325097 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" Dec 11 05:20:09 crc kubenswrapper[4628]: I1211 05:20:09.325492 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6qvrg" event={"ID":"fa963e29-dda2-4d61-827f-2da2d53bfe52","Type":"ContainerDied","Data":"a031c752421dd5ade06805969648aca8d48fa22fb298c058fcd6776f6598771f"} Dec 11 05:20:09 crc kubenswrapper[4628]: I1211 05:20:09.325991 4628 scope.go:117] "RemoveContainer" containerID="c230ba630b8d97f4a4826e1ff29a086a5a0da65b17f686c25ca5f3c18789357b" Dec 11 05:20:09 crc kubenswrapper[4628]: I1211 05:20:09.387932 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6qvrg"] Dec 11 05:20:09 crc kubenswrapper[4628]: I1211 05:20:09.397370 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6qvrg"] Dec 11 05:20:09 crc kubenswrapper[4628]: I1211 05:20:09.899157 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa963e29-dda2-4d61-827f-2da2d53bfe52" path="/var/lib/kubelet/pods/fa963e29-dda2-4d61-827f-2da2d53bfe52/volumes" Dec 11 05:21:01 crc kubenswrapper[4628]: I1211 05:21:01.426901 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:21:01 crc kubenswrapper[4628]: I1211 05:21:01.427486 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:21:31 crc kubenswrapper[4628]: I1211 05:21:31.427056 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:21:31 crc kubenswrapper[4628]: I1211 05:21:31.427626 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:22:01 crc kubenswrapper[4628]: I1211 05:22:01.426677 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:22:01 crc kubenswrapper[4628]: I1211 05:22:01.427324 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:22:01 crc kubenswrapper[4628]: I1211 05:22:01.427386 4628 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:22:01 crc kubenswrapper[4628]: I1211 05:22:01.428155 4628 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ddad4725ec6c3a9427422cdf04ad9742fa14cfafe1a3cf96a99beec112e27db7"} pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 05:22:01 crc kubenswrapper[4628]: I1211 05:22:01.428239 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" containerID="cri-o://ddad4725ec6c3a9427422cdf04ad9742fa14cfafe1a3cf96a99beec112e27db7" gracePeriod=600 Dec 11 05:22:02 crc kubenswrapper[4628]: I1211 05:22:02.030316 4628 generic.go:334] "Generic (PLEG): container finished" podID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerID="ddad4725ec6c3a9427422cdf04ad9742fa14cfafe1a3cf96a99beec112e27db7" exitCode=0 Dec 11 05:22:02 crc kubenswrapper[4628]: I1211 05:22:02.030396 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerDied","Data":"ddad4725ec6c3a9427422cdf04ad9742fa14cfafe1a3cf96a99beec112e27db7"} Dec 11 05:22:02 crc kubenswrapper[4628]: I1211 05:22:02.031189 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"96488dad0283d5c27c0403cf2393677a28a1af0afc44fbcf6fbe3d10bd0060af"} Dec 11 05:22:02 crc kubenswrapper[4628]: I1211 05:22:02.031240 4628 scope.go:117] "RemoveContainer" containerID="1ceb30e4a3d9e4f8a0cb5dd8e8ae33f28f9c75bc4c4706b76660db8785b07748" Dec 11 05:24:01 crc kubenswrapper[4628]: I1211 05:24:01.427424 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:24:01 crc kubenswrapper[4628]: I1211 05:24:01.428111 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.763482 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-2fjb7"] Dec 11 05:24:10 crc kubenswrapper[4628]: E1211 05:24:10.764239 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa963e29-dda2-4d61-827f-2da2d53bfe52" containerName="registry" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.764252 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa963e29-dda2-4d61-827f-2da2d53bfe52" containerName="registry" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.764334 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa963e29-dda2-4d61-827f-2da2d53bfe52" containerName="registry" Dec 
11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.764675 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-2fjb7" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.766703 4628 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-4thtv" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.767133 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.767370 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.776488 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-2fjb7"] Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.790874 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-lf47f"] Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.791477 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-lf47f" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.795183 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-gfsxf"] Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.795737 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-gfsxf" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.797942 4628 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-xvbwt" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.800608 4628 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-cj72m" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.812946 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-gfsxf"] Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.821387 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn99w\" (UniqueName: \"kubernetes.io/projected/e08b2800-f237-4960-a939-b24a4f34b340-kube-api-access-tn99w\") pod \"cert-manager-cainjector-7f985d654d-2fjb7\" (UID: \"e08b2800-f237-4960-a939-b24a4f34b340\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-2fjb7" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.821435 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhjv5\" (UniqueName: \"kubernetes.io/projected/2ab657c9-076c-4a39-9928-e92e8e276547-kube-api-access-zhjv5\") pod \"cert-manager-webhook-5655c58dd6-lf47f\" (UID: \"2ab657c9-076c-4a39-9928-e92e8e276547\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-lf47f" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.821471 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k5jk\" (UniqueName: \"kubernetes.io/projected/6f9a6c48-1127-4e6c-bc88-133de5ba68e1-kube-api-access-4k5jk\") pod \"cert-manager-5b446d88c5-gfsxf\" (UID: \"6f9a6c48-1127-4e6c-bc88-133de5ba68e1\") " pod="cert-manager/cert-manager-5b446d88c5-gfsxf" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.843472 4628 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-lf47f"] Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.923218 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn99w\" (UniqueName: \"kubernetes.io/projected/e08b2800-f237-4960-a939-b24a4f34b340-kube-api-access-tn99w\") pod \"cert-manager-cainjector-7f985d654d-2fjb7\" (UID: \"e08b2800-f237-4960-a939-b24a4f34b340\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-2fjb7" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.923300 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhjv5\" (UniqueName: \"kubernetes.io/projected/2ab657c9-076c-4a39-9928-e92e8e276547-kube-api-access-zhjv5\") pod \"cert-manager-webhook-5655c58dd6-lf47f\" (UID: \"2ab657c9-076c-4a39-9928-e92e8e276547\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-lf47f" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.923355 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k5jk\" (UniqueName: \"kubernetes.io/projected/6f9a6c48-1127-4e6c-bc88-133de5ba68e1-kube-api-access-4k5jk\") pod \"cert-manager-5b446d88c5-gfsxf\" (UID: \"6f9a6c48-1127-4e6c-bc88-133de5ba68e1\") " pod="cert-manager/cert-manager-5b446d88c5-gfsxf" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.943792 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhjv5\" (UniqueName: \"kubernetes.io/projected/2ab657c9-076c-4a39-9928-e92e8e276547-kube-api-access-zhjv5\") pod \"cert-manager-webhook-5655c58dd6-lf47f\" (UID: \"2ab657c9-076c-4a39-9928-e92e8e276547\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-lf47f" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.943794 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k5jk\" (UniqueName: \"kubernetes.io/projected/6f9a6c48-1127-4e6c-bc88-133de5ba68e1-kube-api-access-4k5jk\") pod \"cert-manager-5b446d88c5-gfsxf\" (UID: \"6f9a6c48-1127-4e6c-bc88-133de5ba68e1\") " pod="cert-manager/cert-manager-5b446d88c5-gfsxf" Dec 11 05:24:10 crc kubenswrapper[4628]: I1211 05:24:10.944525 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn99w\" (UniqueName: \"kubernetes.io/projected/e08b2800-f237-4960-a939-b24a4f34b340-kube-api-access-tn99w\") pod \"cert-manager-cainjector-7f985d654d-2fjb7\" (UID: \"e08b2800-f237-4960-a939-b24a4f34b340\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-2fjb7" Dec 11 05:24:11 crc kubenswrapper[4628]: I1211 05:24:11.079396 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-2fjb7" Dec 11 05:24:11 crc kubenswrapper[4628]: I1211 05:24:11.112612 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-lf47f" Dec 11 05:24:11 crc kubenswrapper[4628]: I1211 05:24:11.115810 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-gfsxf" Dec 11 05:24:11 crc kubenswrapper[4628]: I1211 05:24:11.388049 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-gfsxf"] Dec 11 05:24:11 crc kubenswrapper[4628]: I1211 05:24:11.400858 4628 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 05:24:11 crc kubenswrapper[4628]: I1211 05:24:11.562420 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-lf47f"] Dec 11 05:24:11 crc kubenswrapper[4628]: W1211 05:24:11.566749 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ab657c9_076c_4a39_9928_e92e8e276547.slice/crio-e35f6e67848bdaa9259f3e8ce0dea9b065da806b4fe2b9164edccb22ecfe7a65 WatchSource:0}: Error finding container e35f6e67848bdaa9259f3e8ce0dea9b065da806b4fe2b9164edccb22ecfe7a65: Status 404 returned error can't find the container with id e35f6e67848bdaa9259f3e8ce0dea9b065da806b4fe2b9164edccb22ecfe7a65 Dec 11 05:24:11 crc kubenswrapper[4628]: I1211 05:24:11.571458 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-2fjb7"] Dec 11 05:24:11 crc kubenswrapper[4628]: I1211 05:24:11.918204 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-lf47f" event={"ID":"2ab657c9-076c-4a39-9928-e92e8e276547","Type":"ContainerStarted","Data":"e35f6e67848bdaa9259f3e8ce0dea9b065da806b4fe2b9164edccb22ecfe7a65"} Dec 11 05:24:11 crc kubenswrapper[4628]: I1211 05:24:11.919378 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-gfsxf" event={"ID":"6f9a6c48-1127-4e6c-bc88-133de5ba68e1","Type":"ContainerStarted","Data":"36b857c58c972702ce0c8a57d21a533ae161eda99e136b9bb270fe6428f96edd"} Dec 11 05:24:11 crc kubenswrapper[4628]: I1211 05:24:11.920378 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-2fjb7" event={"ID":"e08b2800-f237-4960-a939-b24a4f34b340","Type":"ContainerStarted","Data":"668b71e3549755f9019aa34715515004da2b0054852705e79476ccfe34eee9c5"} Dec 11 05:24:15 crc kubenswrapper[4628]: I1211 05:24:15.954473 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-lf47f" event={"ID":"2ab657c9-076c-4a39-9928-e92e8e276547","Type":"ContainerStarted","Data":"4d90a02fad171a435e6c8f7c3904df6c06188972dca66d3069e26837fa93714a"} Dec 11 05:24:15 crc kubenswrapper[4628]: I1211 05:24:15.955505 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-lf47f" Dec 11 05:24:15 crc kubenswrapper[4628]: I1211 05:24:15.956315 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-gfsxf" event={"ID":"6f9a6c48-1127-4e6c-bc88-133de5ba68e1","Type":"ContainerStarted","Data":"edae64012397fbbe16248e99ccb275a29b9babef75534709d5ab6f65992e0ab6"} Dec 11 05:24:15 crc kubenswrapper[4628]: I1211 05:24:15.957488 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-2fjb7" event={"ID":"e08b2800-f237-4960-a939-b24a4f34b340","Type":"ContainerStarted","Data":"5f8b40ec207f1bbda116fa4fac1313b7722a55cd394e853fe3263d9d1cc731b1"} Dec 11 05:24:15 crc kubenswrapper[4628]: I1211 05:24:15.981507 4628 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-lf47f" podStartSLOduration=2.431369305 podStartE2EDuration="5.981481906s" podCreationTimestamp="2025-12-11 05:24:10 +0000 UTC" firstStartedPulling="2025-12-11 05:24:11.568913563 +0000 UTC m=+553.986260261" lastFinishedPulling="2025-12-11 05:24:15.119026164 +0000 UTC m=+557.536372862" observedRunningTime="2025-12-11 05:24:15.97451637 +0000 UTC m=+558.391863108" watchObservedRunningTime="2025-12-11 05:24:15.981481906 +0000 UTC m=+558.398828644" Dec 11 05:24:15 crc kubenswrapper[4628]: I1211 05:24:15.996957 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-2fjb7" podStartSLOduration=2.5296357130000002 podStartE2EDuration="5.996939736s" podCreationTimestamp="2025-12-11 05:24:10 +0000 UTC" firstStartedPulling="2025-12-11 05:24:11.586268604 +0000 UTC m=+554.003615292" lastFinishedPulling="2025-12-11 05:24:15.053572617 +0000 UTC m=+557.470919315" observedRunningTime="2025-12-11 05:24:15.996193426 +0000 UTC m=+558.413540214" watchObservedRunningTime="2025-12-11 05:24:15.996939736 +0000 UTC m=+558.414286444" Dec 11 05:24:16 crc kubenswrapper[4628]: I1211 05:24:16.018185 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-gfsxf" podStartSLOduration=2.363758321 podStartE2EDuration="6.018161839s" podCreationTimestamp="2025-12-11 05:24:10 +0000 UTC" firstStartedPulling="2025-12-11 05:24:11.400671748 +0000 UTC m=+553.818018446" lastFinishedPulling="2025-12-11 05:24:15.055075266 +0000 UTC m=+557.472421964" observedRunningTime="2025-12-11 05:24:16.011689018 +0000 UTC m=+558.429035716" watchObservedRunningTime="2025-12-11 05:24:16.018161839 +0000 UTC m=+558.435508557" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.118322 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-lf47f" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.227798 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-r7545"] Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.228763 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="ovn-controller" containerID="cri-o://e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d" gracePeriod=30 Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.228881 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="nbdb" containerID="cri-o://d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa" gracePeriod=30 Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.228977 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="ovn-acl-logging" containerID="cri-o://860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d" gracePeriod=30 Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.229040 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="sbdb" 
containerID="cri-o://8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84" gracePeriod=30 Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.228997 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="kube-rbac-proxy-node" containerID="cri-o://7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59" gracePeriod=30 Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.229093 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f" gracePeriod=30 Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.229096 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="northd" containerID="cri-o://165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853" gracePeriod=30 Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.274199 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="ovnkube-controller" containerID="cri-o://1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085" gracePeriod=30 Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.522116 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7545_0904ad55-afbb-42a5-82e9-1f68c8d50a84/ovn-acl-logging/0.log" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.522530 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7545_0904ad55-afbb-42a5-82e9-1f68c8d50a84/ovn-controller/0.log" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.522918 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572023 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-run-ovn\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572071 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-run-netns\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572100 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0904ad55-afbb-42a5-82e9-1f68c8d50a84-ovnkube-config\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572124 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0904ad55-afbb-42a5-82e9-1f68c8d50a84-ovn-node-metrics-cert\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572152 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-systemd-units\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572168 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzvn7\" (UniqueName: \"kubernetes.io/projected/0904ad55-afbb-42a5-82e9-1f68c8d50a84-kube-api-access-pzvn7\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572166 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572185 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-slash\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572212 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572231 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-slash" (OuterVolumeSpecName: "host-slash") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572268 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0904ad55-afbb-42a5-82e9-1f68c8d50a84-env-overrides\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572297 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-kubelet\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572300 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572329 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572340 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-run-ovn-kubernetes\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572396 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-node-log\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572436 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-cni-netd\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572461 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-run-systemd\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572495 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-etc-openvswitch\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572525 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-cni-bin\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572543 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-log-socket\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572581 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0904ad55-afbb-42a5-82e9-1f68c8d50a84-ovnkube-script-lib\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572594 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0904ad55-afbb-42a5-82e9-1f68c8d50a84-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572602 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-var-lib-openvswitch\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572629 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0904ad55-afbb-42a5-82e9-1f68c8d50a84-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572637 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-var-lib-cni-networks-ovn-kubernetes\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572667 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-run-openvswitch\") pod \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\" (UID: \"0904ad55-afbb-42a5-82e9-1f68c8d50a84\") " Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572878 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572952 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.572978 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.573133 4628 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.573151 4628 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.573166 4628 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.573177 4628 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.573187 4628 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0904ad55-afbb-42a5-82e9-1f68c8d50a84-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.573197 4628 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.573207 4628 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-slash\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.573219 4628 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0904ad55-afbb-42a5-82e9-1f68c8d50a84-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.573230 4628 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.573240 4628 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.573272 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.573301 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-log-socket" (OuterVolumeSpecName: "log-socket") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.573629 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.573668 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-node-log" (OuterVolumeSpecName: "node-log") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.573691 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.573688 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0904ad55-afbb-42a5-82e9-1f68c8d50a84-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.573715 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.584344 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0904ad55-afbb-42a5-82e9-1f68c8d50a84-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.585072 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0904ad55-afbb-42a5-82e9-1f68c8d50a84-kube-api-access-pzvn7" (OuterVolumeSpecName: "kube-api-access-pzvn7") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "kube-api-access-pzvn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.587798 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-28gsh"] Dec 11 05:24:21 crc kubenswrapper[4628]: E1211 05:24:21.588184 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="kubecfg-setup" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.588208 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="kubecfg-setup" Dec 11 05:24:21 crc kubenswrapper[4628]: E1211 05:24:21.588221 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="ovnkube-controller" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.588230 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="ovnkube-controller" Dec 11 05:24:21 crc kubenswrapper[4628]: E1211 05:24:21.588240 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="kube-rbac-proxy-ovn-metrics" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.588249 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="kube-rbac-proxy-ovn-metrics" Dec 11 05:24:21 crc kubenswrapper[4628]: E1211 05:24:21.588262 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="northd" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.588270 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="northd" Dec 11 05:24:21 crc kubenswrapper[4628]: E1211 05:24:21.588282 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="ovn-acl-logging" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.588289 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="ovn-acl-logging" Dec 11 05:24:21 crc kubenswrapper[4628]: E1211 05:24:21.588300 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="nbdb" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.588308 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="nbdb" Dec 11 05:24:21 crc kubenswrapper[4628]: E1211 05:24:21.588318 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="ovn-controller" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.588327 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="ovn-controller" Dec 11 05:24:21 crc kubenswrapper[4628]: E1211 05:24:21.588339 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="kube-rbac-proxy-node" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.588347 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="kube-rbac-proxy-node" Dec 11 05:24:21 crc kubenswrapper[4628]: E1211 05:24:21.588360 4628 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="sbdb" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.588371 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="sbdb" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.588502 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="sbdb" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.588516 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="ovnkube-controller" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.588527 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="kube-rbac-proxy-ovn-metrics" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.588538 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="nbdb" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.588551 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="ovn-acl-logging" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.588564 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="ovn-controller" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.588575 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="northd" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.588586 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerName="kube-rbac-proxy-node" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.592305 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.596075 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "0904ad55-afbb-42a5-82e9-1f68c8d50a84" (UID: "0904ad55-afbb-42a5-82e9-1f68c8d50a84"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.674229 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-node-log\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.674288 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-run-ovn-kubernetes\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.674321 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/54767ab5-546d-44ae-926d-2d593d8519f7-ovn-node-metrics-cert\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.674345 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.674376 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-etc-openvswitch\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.674400 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-var-lib-openvswitch\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.674424 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-log-socket\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.674535 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/54767ab5-546d-44ae-926d-2d593d8519f7-ovnkube-script-lib\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.674700 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-run-ovn\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.674761 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-slash\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.674786 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-cni-bin\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.674827 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tww5\" (UniqueName: \"kubernetes.io/projected/54767ab5-546d-44ae-926d-2d593d8519f7-kube-api-access-8tww5\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.674916 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-systemd-units\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.674981 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-cni-netd\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.675055 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/54767ab5-546d-44ae-926d-2d593d8519f7-env-overrides\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.675080 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-run-netns\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.675118 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-run-systemd\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.675136 4628 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-run-openvswitch\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.675171 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-kubelet\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.675248 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/54767ab5-546d-44ae-926d-2d593d8519f7-ovnkube-config\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.675324 4628 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-node-log\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.675343 4628 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.675353 4628 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.675362 4628 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.675370 4628 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-log-socket\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.675381 4628 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0904ad55-afbb-42a5-82e9-1f68c8d50a84-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.675389 4628 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.675400 4628 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0904ad55-afbb-42a5-82e9-1f68c8d50a84-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.675410 4628 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0904ad55-afbb-42a5-82e9-1f68c8d50a84-ovn-node-metrics-cert\") on node 
\"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.675421 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzvn7\" (UniqueName: \"kubernetes.io/projected/0904ad55-afbb-42a5-82e9-1f68c8d50a84-kube-api-access-pzvn7\") on node \"crc\" DevicePath \"\"" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.777286 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tww5\" (UniqueName: \"kubernetes.io/projected/54767ab5-546d-44ae-926d-2d593d8519f7-kube-api-access-8tww5\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.777390 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-systemd-units\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.777430 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-cni-netd\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.777472 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/54767ab5-546d-44ae-926d-2d593d8519f7-env-overrides\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.777593 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-cni-netd\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.777604 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-systemd-units\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.777809 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-run-netns\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.777503 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-run-netns\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779096 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-run-systemd\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779122 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-run-openvswitch\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779171 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-kubelet\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779173 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-run-systemd\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779193 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/54767ab5-546d-44ae-926d-2d593d8519f7-ovnkube-config\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779273 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-node-log\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779320 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-run-ovn-kubernetes\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779366 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/54767ab5-546d-44ae-926d-2d593d8519f7-ovn-node-metrics-cert\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779412 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779442 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/54767ab5-546d-44ae-926d-2d593d8519f7-env-overrides\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779487 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-var-lib-openvswitch\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779452 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-var-lib-openvswitch\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779542 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-etc-openvswitch\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779545 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-run-openvswitch\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779586 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-log-socket\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779624 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/54767ab5-546d-44ae-926d-2d593d8519f7-ovnkube-script-lib\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779666 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-run-ovn\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779710 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-slash\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779731 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/54767ab5-546d-44ae-926d-2d593d8519f7-ovnkube-config\") pod \"ovnkube-node-28gsh\" (UID: 
\"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779738 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-cni-bin\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779772 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-kubelet\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.779930 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-cni-bin\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.780273 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-node-log\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.780316 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.780346 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-etc-openvswitch\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.780373 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-log-socket\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.780405 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-slash\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.780430 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-host-run-ovn-kubernetes\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc 
kubenswrapper[4628]: I1211 05:24:21.780465 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/54767ab5-546d-44ae-926d-2d593d8519f7-run-ovn\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.781061 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/54767ab5-546d-44ae-926d-2d593d8519f7-ovnkube-script-lib\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.785025 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/54767ab5-546d-44ae-926d-2d593d8519f7-ovn-node-metrics-cert\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.794936 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tww5\" (UniqueName: \"kubernetes.io/projected/54767ab5-546d-44ae-926d-2d593d8519f7-kube-api-access-8tww5\") pod \"ovnkube-node-28gsh\" (UID: \"54767ab5-546d-44ae-926d-2d593d8519f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.906416 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.996115 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m7bbt_db022de3-87d1-493a-a77d-39d56bd83c22/kube-multus/0.log" Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.996378 4628 generic.go:334] "Generic (PLEG): container finished" podID="db022de3-87d1-493a-a77d-39d56bd83c22" containerID="55eb6510038209e1df420ab43dba3056f51d24eb63d8e27fad4be5844758bec6" exitCode=2 Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.996467 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m7bbt" event={"ID":"db022de3-87d1-493a-a77d-39d56bd83c22","Type":"ContainerDied","Data":"55eb6510038209e1df420ab43dba3056f51d24eb63d8e27fad4be5844758bec6"} Dec 11 05:24:21 crc kubenswrapper[4628]: I1211 05:24:21.996989 4628 scope.go:117] "RemoveContainer" containerID="55eb6510038209e1df420ab43dba3056f51d24eb63d8e27fad4be5844758bec6" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.003535 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7545_0904ad55-afbb-42a5-82e9-1f68c8d50a84/ovn-acl-logging/0.log" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004071 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r7545_0904ad55-afbb-42a5-82e9-1f68c8d50a84/ovn-controller/0.log" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004416 4628 generic.go:334] "Generic (PLEG): container finished" podID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerID="1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085" exitCode=0 Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004435 4628 generic.go:334] "Generic (PLEG): container finished" podID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" 
containerID="8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84" exitCode=0 Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004441 4628 generic.go:334] "Generic (PLEG): container finished" podID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerID="d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa" exitCode=0 Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004450 4628 generic.go:334] "Generic (PLEG): container finished" podID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerID="165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853" exitCode=0 Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004457 4628 generic.go:334] "Generic (PLEG): container finished" podID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerID="fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f" exitCode=0 Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004465 4628 generic.go:334] "Generic (PLEG): container finished" podID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerID="7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59" exitCode=0 Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004471 4628 generic.go:334] "Generic (PLEG): container finished" podID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerID="860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d" exitCode=143 Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004477 4628 generic.go:334] "Generic (PLEG): container finished" podID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" containerID="e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d" exitCode=143 Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004508 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerDied","Data":"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004532 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerDied","Data":"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004542 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerDied","Data":"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004551 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerDied","Data":"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004559 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerDied","Data":"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004570 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerDied","Data":"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59"} Dec 11 05:24:22 crc 
kubenswrapper[4628]: I1211 05:24:22.004581 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004591 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004597 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004605 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerDied","Data":"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004612 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004618 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004623 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004628 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004633 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004638 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004643 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004648 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004653 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004660 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" 
event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerDied","Data":"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004668 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004674 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004679 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004684 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004690 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004695 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004700 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004705 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004710 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004716 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" event={"ID":"0904ad55-afbb-42a5-82e9-1f68c8d50a84","Type":"ContainerDied","Data":"1141279842bfb3efe7f24ded16aaefd3db8a03eb6025a837f295bf8655f2e1a1"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004723 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004730 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004736 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004741 4628 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004746 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004751 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004770 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004776 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004780 4628 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004802 4628 scope.go:117] "RemoveContainer" containerID="1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.004963 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r7545" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.006533 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" event={"ID":"54767ab5-546d-44ae-926d-2d593d8519f7","Type":"ContainerStarted","Data":"56f34e1458f04d5a6c7ca3798f0961b4989de6b17b08d472d2a760c1bc285022"} Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.033786 4628 scope.go:117] "RemoveContainer" containerID="8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.042751 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-r7545"] Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.046804 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-r7545"] Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.094982 4628 scope.go:117] "RemoveContainer" containerID="d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.108439 4628 scope.go:117] "RemoveContainer" containerID="165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.133127 4628 scope.go:117] "RemoveContainer" containerID="fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.157258 4628 scope.go:117] "RemoveContainer" containerID="7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.182924 4628 scope.go:117] "RemoveContainer" containerID="860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 
05:24:22.197742 4628 scope.go:117] "RemoveContainer" containerID="e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.212021 4628 scope.go:117] "RemoveContainer" containerID="827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.226470 4628 scope.go:117] "RemoveContainer" containerID="1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085" Dec 11 05:24:22 crc kubenswrapper[4628]: E1211 05:24:22.227087 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085\": container with ID starting with 1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085 not found: ID does not exist" containerID="1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.227127 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085"} err="failed to get container status \"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085\": rpc error: code = NotFound desc = could not find container \"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085\": container with ID starting with 1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.227152 4628 scope.go:117] "RemoveContainer" containerID="8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84" Dec 11 05:24:22 crc kubenswrapper[4628]: E1211 05:24:22.227569 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84\": container with ID starting with 8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84 not found: ID does not exist" containerID="8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.227596 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84"} err="failed to get container status \"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84\": rpc error: code = NotFound desc = could not find container \"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84\": container with ID starting with 8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.227613 4628 scope.go:117] "RemoveContainer" containerID="d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa" Dec 11 05:24:22 crc kubenswrapper[4628]: E1211 05:24:22.228148 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa\": container with ID starting with d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa not found: ID does not exist" containerID="d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.228173 4628 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa"} err="failed to get container status \"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa\": rpc error: code = NotFound desc = could not find container \"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa\": container with ID starting with d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.228188 4628 scope.go:117] "RemoveContainer" containerID="165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853" Dec 11 05:24:22 crc kubenswrapper[4628]: E1211 05:24:22.228513 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853\": container with ID starting with 165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853 not found: ID does not exist" containerID="165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.228543 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853"} err="failed to get container status \"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853\": rpc error: code = NotFound desc = could not find container \"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853\": container with ID starting with 165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.228566 4628 scope.go:117] "RemoveContainer" containerID="fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f" Dec 11 05:24:22 crc kubenswrapper[4628]: E1211 05:24:22.228942 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f\": container with ID starting with fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f not found: ID does not exist" containerID="fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.228974 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f"} err="failed to get container status \"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f\": rpc error: code = NotFound desc = could not find container \"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f\": container with ID starting with fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.228992 4628 scope.go:117] "RemoveContainer" containerID="7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59" Dec 11 05:24:22 crc kubenswrapper[4628]: E1211 05:24:22.229264 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59\": container with ID starting with 7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59 not found: ID does not exist" 
containerID="7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.229293 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59"} err="failed to get container status \"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59\": rpc error: code = NotFound desc = could not find container \"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59\": container with ID starting with 7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.229312 4628 scope.go:117] "RemoveContainer" containerID="860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d" Dec 11 05:24:22 crc kubenswrapper[4628]: E1211 05:24:22.229573 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d\": container with ID starting with 860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d not found: ID does not exist" containerID="860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.229599 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d"} err="failed to get container status \"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d\": rpc error: code = NotFound desc = could not find container \"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d\": container with ID starting with 860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.229616 4628 scope.go:117] "RemoveContainer" containerID="e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d" Dec 11 05:24:22 crc kubenswrapper[4628]: E1211 05:24:22.229913 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d\": container with ID starting with e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d not found: ID does not exist" containerID="e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.229937 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d"} err="failed to get container status \"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d\": rpc error: code = NotFound desc = could not find container \"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d\": container with ID starting with e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.229953 4628 scope.go:117] "RemoveContainer" containerID="827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767" Dec 11 05:24:22 crc kubenswrapper[4628]: E1211 05:24:22.230176 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767\": container with ID starting with 827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767 not found: ID does not exist" containerID="827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.230216 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767"} err="failed to get container status \"827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767\": rpc error: code = NotFound desc = could not find container \"827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767\": container with ID starting with 827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.230236 4628 scope.go:117] "RemoveContainer" containerID="1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.230799 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085"} err="failed to get container status \"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085\": rpc error: code = NotFound desc = could not find container \"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085\": container with ID starting with 1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.230823 4628 scope.go:117] "RemoveContainer" containerID="8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.231137 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84"} err="failed to get container status \"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84\": rpc error: code = NotFound desc = could not find container \"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84\": container with ID starting with 8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.231226 4628 scope.go:117] "RemoveContainer" containerID="d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.231555 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa"} err="failed to get container status \"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa\": rpc error: code = NotFound desc = could not find container \"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa\": container with ID starting with d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.231577 4628 scope.go:117] "RemoveContainer" containerID="165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.231911 4628 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853"} err="failed to get container status \"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853\": rpc error: code = NotFound desc = could not find container \"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853\": container with ID starting with 165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.231947 4628 scope.go:117] "RemoveContainer" containerID="fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.232281 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f"} err="failed to get container status \"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f\": rpc error: code = NotFound desc = could not find container \"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f\": container with ID starting with fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.232317 4628 scope.go:117] "RemoveContainer" containerID="7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.232629 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59"} err="failed to get container status \"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59\": rpc error: code = NotFound desc = could not find container \"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59\": container with ID starting with 7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.232651 4628 scope.go:117] "RemoveContainer" containerID="860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.233035 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d"} err="failed to get container status \"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d\": rpc error: code = NotFound desc = could not find container \"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d\": container with ID starting with 860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.233059 4628 scope.go:117] "RemoveContainer" containerID="e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.233451 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d"} err="failed to get container status \"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d\": rpc error: code = NotFound desc = could not find container \"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d\": container with ID starting with e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d not found: ID does not exist" Dec 
11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.233474 4628 scope.go:117] "RemoveContainer" containerID="827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.233815 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767"} err="failed to get container status \"827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767\": rpc error: code = NotFound desc = could not find container \"827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767\": container with ID starting with 827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.233867 4628 scope.go:117] "RemoveContainer" containerID="1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.234155 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085"} err="failed to get container status \"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085\": rpc error: code = NotFound desc = could not find container \"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085\": container with ID starting with 1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.234192 4628 scope.go:117] "RemoveContainer" containerID="8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.234463 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84"} err="failed to get container status \"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84\": rpc error: code = NotFound desc = could not find container \"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84\": container with ID starting with 8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.234488 4628 scope.go:117] "RemoveContainer" containerID="d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.234819 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa"} err="failed to get container status \"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa\": rpc error: code = NotFound desc = could not find container \"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa\": container with ID starting with d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.234840 4628 scope.go:117] "RemoveContainer" containerID="165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.235168 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853"} err="failed to get container status 
\"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853\": rpc error: code = NotFound desc = could not find container \"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853\": container with ID starting with 165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.235186 4628 scope.go:117] "RemoveContainer" containerID="fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.235391 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f"} err="failed to get container status \"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f\": rpc error: code = NotFound desc = could not find container \"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f\": container with ID starting with fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.235409 4628 scope.go:117] "RemoveContainer" containerID="7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.235645 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59"} err="failed to get container status \"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59\": rpc error: code = NotFound desc = could not find container \"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59\": container with ID starting with 7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.235730 4628 scope.go:117] "RemoveContainer" containerID="860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.236142 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d"} err="failed to get container status \"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d\": rpc error: code = NotFound desc = could not find container \"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d\": container with ID starting with 860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.236213 4628 scope.go:117] "RemoveContainer" containerID="e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.236549 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d"} err="failed to get container status \"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d\": rpc error: code = NotFound desc = could not find container \"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d\": container with ID starting with e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.236621 4628 scope.go:117] "RemoveContainer" 
containerID="827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.236918 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767"} err="failed to get container status \"827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767\": rpc error: code = NotFound desc = could not find container \"827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767\": container with ID starting with 827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.237040 4628 scope.go:117] "RemoveContainer" containerID="1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.237353 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085"} err="failed to get container status \"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085\": rpc error: code = NotFound desc = could not find container \"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085\": container with ID starting with 1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.237507 4628 scope.go:117] "RemoveContainer" containerID="8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.237855 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84"} err="failed to get container status \"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84\": rpc error: code = NotFound desc = could not find container \"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84\": container with ID starting with 8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.237939 4628 scope.go:117] "RemoveContainer" containerID="d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.238247 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa"} err="failed to get container status \"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa\": rpc error: code = NotFound desc = could not find container \"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa\": container with ID starting with d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.238336 4628 scope.go:117] "RemoveContainer" containerID="165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.238644 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853"} err="failed to get container status \"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853\": rpc error: code = NotFound desc = could not find 
container \"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853\": container with ID starting with 165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.238736 4628 scope.go:117] "RemoveContainer" containerID="fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.239053 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f"} err="failed to get container status \"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f\": rpc error: code = NotFound desc = could not find container \"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f\": container with ID starting with fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.239158 4628 scope.go:117] "RemoveContainer" containerID="7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.239470 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59"} err="failed to get container status \"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59\": rpc error: code = NotFound desc = could not find container \"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59\": container with ID starting with 7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.239490 4628 scope.go:117] "RemoveContainer" containerID="860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.239744 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d"} err="failed to get container status \"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d\": rpc error: code = NotFound desc = could not find container \"860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d\": container with ID starting with 860b65170ccf52c2f7d130b3ab89fb17b53d48ac460290cfa05a3fa036d2003d not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.239835 4628 scope.go:117] "RemoveContainer" containerID="e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.240129 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d"} err="failed to get container status \"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d\": rpc error: code = NotFound desc = could not find container \"e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d\": container with ID starting with e3549dc0125a364e02e1b3176842203a505f0732c18e50330f72e97ccfd9a59d not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.240232 4628 scope.go:117] "RemoveContainer" containerID="827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.241034 4628 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767"} err="failed to get container status \"827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767\": rpc error: code = NotFound desc = could not find container \"827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767\": container with ID starting with 827d99bcd5e3b4ef5d1dec5476c6efceedd627b0a096955a4c7e01f129b5b767 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.241193 4628 scope.go:117] "RemoveContainer" containerID="1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.241584 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085"} err="failed to get container status \"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085\": rpc error: code = NotFound desc = could not find container \"1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085\": container with ID starting with 1d6a63da656bd42c66c63e93e0ff6b24c7a58e484b7fa9df990e775ed98c0085 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.241601 4628 scope.go:117] "RemoveContainer" containerID="8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.241876 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84"} err="failed to get container status \"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84\": rpc error: code = NotFound desc = could not find container \"8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84\": container with ID starting with 8ecd2c60cfc61f215df5b675959ae958999f3934a4810ab1a9f6417ba2472e84 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.241967 4628 scope.go:117] "RemoveContainer" containerID="d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.242269 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa"} err="failed to get container status \"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa\": rpc error: code = NotFound desc = could not find container \"d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa\": container with ID starting with d7c2a0cee614a419a7e49e845a5f159faa5574802239995316f566e546f039aa not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.242380 4628 scope.go:117] "RemoveContainer" containerID="165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.242699 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853"} err="failed to get container status \"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853\": rpc error: code = NotFound desc = could not find container \"165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853\": container with ID starting with 
165943c9a5afbe9290ca3b7d243e250950095c380a380a201a5154649fd92853 not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.242777 4628 scope.go:117] "RemoveContainer" containerID="fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.243115 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f"} err="failed to get container status \"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f\": rpc error: code = NotFound desc = could not find container \"fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f\": container with ID starting with fa7fe96938880dac9893d08b9a1d44bff5d6d933fae1f13f355d105d4625063f not found: ID does not exist" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.243229 4628 scope.go:117] "RemoveContainer" containerID="7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59" Dec 11 05:24:22 crc kubenswrapper[4628]: I1211 05:24:22.243545 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59"} err="failed to get container status \"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59\": rpc error: code = NotFound desc = could not find container \"7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59\": container with ID starting with 7aa79ed3a402851263d08cf27066bde87f171101d284f05e997a50adc3ae7d59 not found: ID does not exist" Dec 11 05:24:23 crc kubenswrapper[4628]: I1211 05:24:23.014333 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m7bbt_db022de3-87d1-493a-a77d-39d56bd83c22/kube-multus/0.log" Dec 11 05:24:23 crc kubenswrapper[4628]: I1211 05:24:23.014611 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m7bbt" event={"ID":"db022de3-87d1-493a-a77d-39d56bd83c22","Type":"ContainerStarted","Data":"b3a58d30093aa0213f05ffdb7417969394a5df9ee3062f0c532299e7163b7537"} Dec 11 05:24:23 crc kubenswrapper[4628]: I1211 05:24:23.017724 4628 generic.go:334] "Generic (PLEG): container finished" podID="54767ab5-546d-44ae-926d-2d593d8519f7" containerID="62828134c3ebb484feb2ffca77ccac4150e2b213dd91fa810fc02720ee56a3b2" exitCode=0 Dec 11 05:24:23 crc kubenswrapper[4628]: I1211 05:24:23.017745 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" event={"ID":"54767ab5-546d-44ae-926d-2d593d8519f7","Type":"ContainerDied","Data":"62828134c3ebb484feb2ffca77ccac4150e2b213dd91fa810fc02720ee56a3b2"} Dec 11 05:24:23 crc kubenswrapper[4628]: I1211 05:24:23.901414 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0904ad55-afbb-42a5-82e9-1f68c8d50a84" path="/var/lib/kubelet/pods/0904ad55-afbb-42a5-82e9-1f68c8d50a84/volumes" Dec 11 05:24:24 crc kubenswrapper[4628]: I1211 05:24:24.025708 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" event={"ID":"54767ab5-546d-44ae-926d-2d593d8519f7","Type":"ContainerStarted","Data":"2f7a2cd388543d05990e96535558e5c781d27119af9f6d42fc98a73164c5ab7a"} Dec 11 05:24:24 crc kubenswrapper[4628]: I1211 05:24:24.025775 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" 
event={"ID":"54767ab5-546d-44ae-926d-2d593d8519f7","Type":"ContainerStarted","Data":"aed8bdc53bfcad8cc0d2e87e17dcd9a953ba8693e3e00d913843f376e69c7729"} Dec 11 05:24:24 crc kubenswrapper[4628]: I1211 05:24:24.025796 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" event={"ID":"54767ab5-546d-44ae-926d-2d593d8519f7","Type":"ContainerStarted","Data":"180fa914bdafb01ab6759597db2fcca645a84cf3f214f7bf255729c1f53a68ae"} Dec 11 05:24:24 crc kubenswrapper[4628]: I1211 05:24:24.025815 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" event={"ID":"54767ab5-546d-44ae-926d-2d593d8519f7","Type":"ContainerStarted","Data":"14623370122b145095cea4a5f78f46f6b54fb416d0687fcd2670d687cc777637"} Dec 11 05:24:24 crc kubenswrapper[4628]: I1211 05:24:24.025838 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" event={"ID":"54767ab5-546d-44ae-926d-2d593d8519f7","Type":"ContainerStarted","Data":"4004b2f68b02504a197e34d8ee8b21e7fb5432125fb3d5ba3962b61ce4884611"} Dec 11 05:24:24 crc kubenswrapper[4628]: I1211 05:24:24.025897 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" event={"ID":"54767ab5-546d-44ae-926d-2d593d8519f7","Type":"ContainerStarted","Data":"ece9a04a5a3cf1a4ea4e5539fca1e419767e06ea762505638f04f0b1c09d0f65"} Dec 11 05:24:27 crc kubenswrapper[4628]: I1211 05:24:27.058081 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" event={"ID":"54767ab5-546d-44ae-926d-2d593d8519f7","Type":"ContainerStarted","Data":"9102c4059ef97538c01c6dbd105efc5af67bb5cf0c668e40f9093e4302e2a807"} Dec 11 05:24:31 crc kubenswrapper[4628]: I1211 05:24:31.098628 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" event={"ID":"54767ab5-546d-44ae-926d-2d593d8519f7","Type":"ContainerStarted","Data":"20cbcdfebdc3ef8c8b579a767257e87a3c99dee0ed0baf18a7c927191f87371c"} Dec 11 05:24:31 crc kubenswrapper[4628]: I1211 05:24:31.426674 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:24:31 crc kubenswrapper[4628]: I1211 05:24:31.426759 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:24:32 crc kubenswrapper[4628]: I1211 05:24:32.103297 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:32 crc kubenswrapper[4628]: I1211 05:24:32.103629 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:32 crc kubenswrapper[4628]: I1211 05:24:32.136463 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:32 crc kubenswrapper[4628]: I1211 05:24:32.169475 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" podStartSLOduration=11.169455099 podStartE2EDuration="11.169455099s" podCreationTimestamp="2025-12-11 05:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:24:32.135961289 +0000 UTC m=+574.553307987" watchObservedRunningTime="2025-12-11 05:24:32.169455099 +0000 UTC m=+574.586801817" Dec 11 05:24:33 crc kubenswrapper[4628]: I1211 05:24:33.109247 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:33 crc kubenswrapper[4628]: I1211 05:24:33.138271 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:24:35 crc kubenswrapper[4628]: I1211 05:24:35.152507 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-28gsh" Dec 11 05:25:01 crc kubenswrapper[4628]: I1211 05:25:01.426796 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:25:01 crc kubenswrapper[4628]: I1211 05:25:01.427430 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:25:01 crc kubenswrapper[4628]: I1211 05:25:01.427481 4628 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:25:01 crc kubenswrapper[4628]: I1211 05:25:01.428084 4628 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"96488dad0283d5c27c0403cf2393677a28a1af0afc44fbcf6fbe3d10bd0060af"} pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 05:25:01 crc kubenswrapper[4628]: I1211 05:25:01.428153 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" containerID="cri-o://96488dad0283d5c27c0403cf2393677a28a1af0afc44fbcf6fbe3d10bd0060af" gracePeriod=600 Dec 11 05:25:02 crc kubenswrapper[4628]: I1211 05:25:02.306146 4628 generic.go:334] "Generic (PLEG): container finished" podID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerID="96488dad0283d5c27c0403cf2393677a28a1af0afc44fbcf6fbe3d10bd0060af" exitCode=0 Dec 11 05:25:02 crc kubenswrapper[4628]: I1211 05:25:02.306228 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerDied","Data":"96488dad0283d5c27c0403cf2393677a28a1af0afc44fbcf6fbe3d10bd0060af"} Dec 11 05:25:02 crc kubenswrapper[4628]: I1211 05:25:02.307009 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" 
event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"2edbf0424a7d52e635507a6262c52a38d0cf51657fa8d3615985b25f98b6c93c"} Dec 11 05:25:02 crc kubenswrapper[4628]: I1211 05:25:02.307079 4628 scope.go:117] "RemoveContainer" containerID="ddad4725ec6c3a9427422cdf04ad9742fa14cfafe1a3cf96a99beec112e27db7" Dec 11 05:25:04 crc kubenswrapper[4628]: I1211 05:25:04.451687 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh"] Dec 11 05:25:04 crc kubenswrapper[4628]: I1211 05:25:04.453791 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" Dec 11 05:25:04 crc kubenswrapper[4628]: I1211 05:25:04.455948 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 11 05:25:04 crc kubenswrapper[4628]: I1211 05:25:04.465948 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh"] Dec 11 05:25:04 crc kubenswrapper[4628]: I1211 05:25:04.481206 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhxhk\" (UniqueName: \"kubernetes.io/projected/c0e6a71a-6351-4860-a562-05df960a3f2c-kube-api-access-vhxhk\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh\" (UID: \"c0e6a71a-6351-4860-a562-05df960a3f2c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" Dec 11 05:25:04 crc kubenswrapper[4628]: I1211 05:25:04.481308 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0e6a71a-6351-4860-a562-05df960a3f2c-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh\" (UID: \"c0e6a71a-6351-4860-a562-05df960a3f2c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" Dec 11 05:25:04 crc kubenswrapper[4628]: I1211 05:25:04.481361 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0e6a71a-6351-4860-a562-05df960a3f2c-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh\" (UID: \"c0e6a71a-6351-4860-a562-05df960a3f2c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" Dec 11 05:25:04 crc kubenswrapper[4628]: I1211 05:25:04.582311 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhxhk\" (UniqueName: \"kubernetes.io/projected/c0e6a71a-6351-4860-a562-05df960a3f2c-kube-api-access-vhxhk\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh\" (UID: \"c0e6a71a-6351-4860-a562-05df960a3f2c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" Dec 11 05:25:04 crc kubenswrapper[4628]: I1211 05:25:04.582397 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0e6a71a-6351-4860-a562-05df960a3f2c-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh\" (UID: \"c0e6a71a-6351-4860-a562-05df960a3f2c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" Dec 
11 05:25:04 crc kubenswrapper[4628]: I1211 05:25:04.582426 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0e6a71a-6351-4860-a562-05df960a3f2c-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh\" (UID: \"c0e6a71a-6351-4860-a562-05df960a3f2c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" Dec 11 05:25:04 crc kubenswrapper[4628]: I1211 05:25:04.582946 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0e6a71a-6351-4860-a562-05df960a3f2c-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh\" (UID: \"c0e6a71a-6351-4860-a562-05df960a3f2c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" Dec 11 05:25:04 crc kubenswrapper[4628]: I1211 05:25:04.582988 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0e6a71a-6351-4860-a562-05df960a3f2c-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh\" (UID: \"c0e6a71a-6351-4860-a562-05df960a3f2c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" Dec 11 05:25:04 crc kubenswrapper[4628]: I1211 05:25:04.606013 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhxhk\" (UniqueName: \"kubernetes.io/projected/c0e6a71a-6351-4860-a562-05df960a3f2c-kube-api-access-vhxhk\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh\" (UID: \"c0e6a71a-6351-4860-a562-05df960a3f2c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" Dec 11 05:25:04 crc kubenswrapper[4628]: I1211 05:25:04.770174 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" Dec 11 05:25:05 crc kubenswrapper[4628]: I1211 05:25:05.025936 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh"] Dec 11 05:25:05 crc kubenswrapper[4628]: I1211 05:25:05.334588 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" event={"ID":"c0e6a71a-6351-4860-a562-05df960a3f2c","Type":"ContainerStarted","Data":"4c3746bfef875860323e45e2128884369815be3263a0320ad4782f8bf146e597"} Dec 11 05:25:05 crc kubenswrapper[4628]: I1211 05:25:05.334663 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" event={"ID":"c0e6a71a-6351-4860-a562-05df960a3f2c","Type":"ContainerStarted","Data":"fcbef5c3a720d3edcb8037218beb5ef94e1429322de1e81681ec68da1ac9cecf"} Dec 11 05:25:06 crc kubenswrapper[4628]: I1211 05:25:06.370410 4628 generic.go:334] "Generic (PLEG): container finished" podID="c0e6a71a-6351-4860-a562-05df960a3f2c" containerID="4c3746bfef875860323e45e2128884369815be3263a0320ad4782f8bf146e597" exitCode=0 Dec 11 05:25:06 crc kubenswrapper[4628]: I1211 05:25:06.370910 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" event={"ID":"c0e6a71a-6351-4860-a562-05df960a3f2c","Type":"ContainerDied","Data":"4c3746bfef875860323e45e2128884369815be3263a0320ad4782f8bf146e597"} Dec 11 05:25:08 crc kubenswrapper[4628]: I1211 05:25:08.386746 4628 generic.go:334] "Generic (PLEG): container finished" podID="c0e6a71a-6351-4860-a562-05df960a3f2c" containerID="83cd8d24e482666ef5929493a3d29baf5f203a69e5c1cda32ca65d3ad2328017" exitCode=0 Dec 11 05:25:08 crc kubenswrapper[4628]: I1211 05:25:08.386902 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" event={"ID":"c0e6a71a-6351-4860-a562-05df960a3f2c","Type":"ContainerDied","Data":"83cd8d24e482666ef5929493a3d29baf5f203a69e5c1cda32ca65d3ad2328017"} Dec 11 05:25:09 crc kubenswrapper[4628]: I1211 05:25:09.398769 4628 generic.go:334] "Generic (PLEG): container finished" podID="c0e6a71a-6351-4860-a562-05df960a3f2c" containerID="26cee70e65645d1e0bd212641fed9c624108f0755b1f76c8ccac926f5e91a247" exitCode=0 Dec 11 05:25:09 crc kubenswrapper[4628]: I1211 05:25:09.398828 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" event={"ID":"c0e6a71a-6351-4860-a562-05df960a3f2c","Type":"ContainerDied","Data":"26cee70e65645d1e0bd212641fed9c624108f0755b1f76c8ccac926f5e91a247"} Dec 11 05:25:10 crc kubenswrapper[4628]: I1211 05:25:10.648799 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" Dec 11 05:25:10 crc kubenswrapper[4628]: I1211 05:25:10.681397 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0e6a71a-6351-4860-a562-05df960a3f2c-bundle\") pod \"c0e6a71a-6351-4860-a562-05df960a3f2c\" (UID: \"c0e6a71a-6351-4860-a562-05df960a3f2c\") " Dec 11 05:25:10 crc kubenswrapper[4628]: I1211 05:25:10.681442 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0e6a71a-6351-4860-a562-05df960a3f2c-util\") pod \"c0e6a71a-6351-4860-a562-05df960a3f2c\" (UID: \"c0e6a71a-6351-4860-a562-05df960a3f2c\") " Dec 11 05:25:10 crc kubenswrapper[4628]: I1211 05:25:10.681476 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhxhk\" (UniqueName: \"kubernetes.io/projected/c0e6a71a-6351-4860-a562-05df960a3f2c-kube-api-access-vhxhk\") pod \"c0e6a71a-6351-4860-a562-05df960a3f2c\" (UID: \"c0e6a71a-6351-4860-a562-05df960a3f2c\") " Dec 11 05:25:10 crc kubenswrapper[4628]: I1211 05:25:10.682592 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e6a71a-6351-4860-a562-05df960a3f2c-bundle" (OuterVolumeSpecName: "bundle") pod "c0e6a71a-6351-4860-a562-05df960a3f2c" (UID: "c0e6a71a-6351-4860-a562-05df960a3f2c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:25:10 crc kubenswrapper[4628]: I1211 05:25:10.687051 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0e6a71a-6351-4860-a562-05df960a3f2c-kube-api-access-vhxhk" (OuterVolumeSpecName: "kube-api-access-vhxhk") pod "c0e6a71a-6351-4860-a562-05df960a3f2c" (UID: "c0e6a71a-6351-4860-a562-05df960a3f2c"). InnerVolumeSpecName "kube-api-access-vhxhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:25:10 crc kubenswrapper[4628]: I1211 05:25:10.696098 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e6a71a-6351-4860-a562-05df960a3f2c-util" (OuterVolumeSpecName: "util") pod "c0e6a71a-6351-4860-a562-05df960a3f2c" (UID: "c0e6a71a-6351-4860-a562-05df960a3f2c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:25:10 crc kubenswrapper[4628]: I1211 05:25:10.782174 4628 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0e6a71a-6351-4860-a562-05df960a3f2c-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:25:10 crc kubenswrapper[4628]: I1211 05:25:10.782480 4628 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0e6a71a-6351-4860-a562-05df960a3f2c-util\") on node \"crc\" DevicePath \"\"" Dec 11 05:25:10 crc kubenswrapper[4628]: I1211 05:25:10.782498 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhxhk\" (UniqueName: \"kubernetes.io/projected/c0e6a71a-6351-4860-a562-05df960a3f2c-kube-api-access-vhxhk\") on node \"crc\" DevicePath \"\"" Dec 11 05:25:11 crc kubenswrapper[4628]: I1211 05:25:11.414121 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" event={"ID":"c0e6a71a-6351-4860-a562-05df960a3f2c","Type":"ContainerDied","Data":"fcbef5c3a720d3edcb8037218beb5ef94e1429322de1e81681ec68da1ac9cecf"} Dec 11 05:25:11 crc kubenswrapper[4628]: I1211 05:25:11.414160 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcbef5c3a720d3edcb8037218beb5ef94e1429322de1e81681ec68da1ac9cecf" Dec 11 05:25:11 crc kubenswrapper[4628]: I1211 05:25:11.414312 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh" Dec 11 05:25:13 crc kubenswrapper[4628]: I1211 05:25:13.083678 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-589fs"] Dec 11 05:25:13 crc kubenswrapper[4628]: E1211 05:25:13.083990 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e6a71a-6351-4860-a562-05df960a3f2c" containerName="extract" Dec 11 05:25:13 crc kubenswrapper[4628]: I1211 05:25:13.084009 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e6a71a-6351-4860-a562-05df960a3f2c" containerName="extract" Dec 11 05:25:13 crc kubenswrapper[4628]: E1211 05:25:13.084025 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e6a71a-6351-4860-a562-05df960a3f2c" containerName="pull" Dec 11 05:25:13 crc kubenswrapper[4628]: I1211 05:25:13.084038 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e6a71a-6351-4860-a562-05df960a3f2c" containerName="pull" Dec 11 05:25:13 crc kubenswrapper[4628]: E1211 05:25:13.084059 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e6a71a-6351-4860-a562-05df960a3f2c" containerName="util" Dec 11 05:25:13 crc kubenswrapper[4628]: I1211 05:25:13.084075 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e6a71a-6351-4860-a562-05df960a3f2c" containerName="util" Dec 11 05:25:13 crc kubenswrapper[4628]: I1211 05:25:13.084263 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e6a71a-6351-4860-a562-05df960a3f2c" containerName="extract" Dec 11 05:25:13 crc kubenswrapper[4628]: I1211 05:25:13.084792 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-589fs" Dec 11 05:25:13 crc kubenswrapper[4628]: I1211 05:25:13.089701 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 11 05:25:13 crc kubenswrapper[4628]: I1211 05:25:13.089915 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 11 05:25:13 crc kubenswrapper[4628]: I1211 05:25:13.090098 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-x8mx9" Dec 11 05:25:13 crc kubenswrapper[4628]: I1211 05:25:13.092979 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-589fs"] Dec 11 05:25:13 crc kubenswrapper[4628]: I1211 05:25:13.119740 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4pc7\" (UniqueName: \"kubernetes.io/projected/ae1d3899-1bda-4ad7-8512-9582b6fe2c54-kube-api-access-k4pc7\") pod \"nmstate-operator-5b5b58f5c8-589fs\" (UID: \"ae1d3899-1bda-4ad7-8512-9582b6fe2c54\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-589fs" Dec 11 05:25:13 crc kubenswrapper[4628]: I1211 05:25:13.221062 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4pc7\" (UniqueName: \"kubernetes.io/projected/ae1d3899-1bda-4ad7-8512-9582b6fe2c54-kube-api-access-k4pc7\") pod \"nmstate-operator-5b5b58f5c8-589fs\" (UID: \"ae1d3899-1bda-4ad7-8512-9582b6fe2c54\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-589fs" Dec 11 05:25:13 crc kubenswrapper[4628]: I1211 05:25:13.247837 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4pc7\" (UniqueName: \"kubernetes.io/projected/ae1d3899-1bda-4ad7-8512-9582b6fe2c54-kube-api-access-k4pc7\") pod \"nmstate-operator-5b5b58f5c8-589fs\" (UID: \"ae1d3899-1bda-4ad7-8512-9582b6fe2c54\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-589fs" Dec 11 05:25:13 crc kubenswrapper[4628]: I1211 05:25:13.432915 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-589fs" Dec 11 05:25:13 crc kubenswrapper[4628]: I1211 05:25:13.842557 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-589fs"] Dec 11 05:25:14 crc kubenswrapper[4628]: I1211 05:25:14.429691 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-589fs" event={"ID":"ae1d3899-1bda-4ad7-8512-9582b6fe2c54","Type":"ContainerStarted","Data":"9ea42fa5feaaf91c96d49228efdfa80085d8c7acb94804521061dbdd67a96d1a"} Dec 11 05:25:16 crc kubenswrapper[4628]: I1211 05:25:16.440242 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-589fs" event={"ID":"ae1d3899-1bda-4ad7-8512-9582b6fe2c54","Type":"ContainerStarted","Data":"59af3cb327986cc4b2c1af43b1ff304331a1d6e2863b3dc8a54220da490ca449"} Dec 11 05:25:16 crc kubenswrapper[4628]: I1211 05:25:16.460132 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-589fs" podStartSLOduration=1.035070722 podStartE2EDuration="3.460116534s" podCreationTimestamp="2025-12-11 05:25:13 +0000 UTC" firstStartedPulling="2025-12-11 05:25:13.848533874 +0000 UTC m=+616.265880572" lastFinishedPulling="2025-12-11 05:25:16.273579686 +0000 UTC m=+618.690926384" observedRunningTime="2025-12-11 05:25:16.456620566 +0000 UTC m=+618.873967264" watchObservedRunningTime="2025-12-11 05:25:16.460116534 +0000 UTC m=+618.877463232" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.396975 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-tqdqm"] Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.398227 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-tqdqm" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.401270 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-v5dt8" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.401556 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-c54rd"] Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.402202 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-c54rd" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.405054 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.406721 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56dg4\" (UniqueName: \"kubernetes.io/projected/ddeebf78-e410-4562-a173-563b43b1b322-kube-api-access-56dg4\") pod \"nmstate-webhook-5f6d4c5ccb-c54rd\" (UID: \"ddeebf78-e410-4562-a173-563b43b1b322\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-c54rd" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.406764 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ddeebf78-e410-4562-a173-563b43b1b322-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-c54rd\" (UID: \"ddeebf78-e410-4562-a173-563b43b1b322\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-c54rd" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.406795 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbpjs\" (UniqueName: \"kubernetes.io/projected/3a77bda3-3bb4-402d-a4a9-df9e47e8ff39-kube-api-access-dbpjs\") pod \"nmstate-metrics-7f946cbc9-tqdqm\" (UID: \"3a77bda3-3bb4-402d-a4a9-df9e47e8ff39\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-tqdqm" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.413238 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-tqdqm"] Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.433570 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-lg7wb"] Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.434244 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-lg7wb" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.459750 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-c54rd"] Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.509436 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6f4s\" (UniqueName: \"kubernetes.io/projected/c2471a0c-a9c4-4323-9fb6-e67872046a7d-kube-api-access-b6f4s\") pod \"nmstate-handler-lg7wb\" (UID: \"c2471a0c-a9c4-4323-9fb6-e67872046a7d\") " pod="openshift-nmstate/nmstate-handler-lg7wb" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.509497 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c2471a0c-a9c4-4323-9fb6-e67872046a7d-dbus-socket\") pod \"nmstate-handler-lg7wb\" (UID: \"c2471a0c-a9c4-4323-9fb6-e67872046a7d\") " pod="openshift-nmstate/nmstate-handler-lg7wb" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.509542 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c2471a0c-a9c4-4323-9fb6-e67872046a7d-ovs-socket\") pod \"nmstate-handler-lg7wb\" (UID: \"c2471a0c-a9c4-4323-9fb6-e67872046a7d\") " pod="openshift-nmstate/nmstate-handler-lg7wb" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.509596 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56dg4\" (UniqueName: \"kubernetes.io/projected/ddeebf78-e410-4562-a173-563b43b1b322-kube-api-access-56dg4\") pod \"nmstate-webhook-5f6d4c5ccb-c54rd\" (UID: \"ddeebf78-e410-4562-a173-563b43b1b322\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-c54rd" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.510369 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ddeebf78-e410-4562-a173-563b43b1b322-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-c54rd\" (UID: \"ddeebf78-e410-4562-a173-563b43b1b322\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-c54rd" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.510439 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbpjs\" (UniqueName: \"kubernetes.io/projected/3a77bda3-3bb4-402d-a4a9-df9e47e8ff39-kube-api-access-dbpjs\") pod \"nmstate-metrics-7f946cbc9-tqdqm\" (UID: \"3a77bda3-3bb4-402d-a4a9-df9e47e8ff39\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-tqdqm" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.510497 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c2471a0c-a9c4-4323-9fb6-e67872046a7d-nmstate-lock\") pod \"nmstate-handler-lg7wb\" (UID: \"c2471a0c-a9c4-4323-9fb6-e67872046a7d\") " pod="openshift-nmstate/nmstate-handler-lg7wb" Dec 11 05:25:17 crc kubenswrapper[4628]: E1211 05:25:17.510610 4628 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 11 05:25:17 crc kubenswrapper[4628]: E1211 05:25:17.510665 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddeebf78-e410-4562-a173-563b43b1b322-tls-key-pair podName:ddeebf78-e410-4562-a173-563b43b1b322 nodeName:}" failed. 
No retries permitted until 2025-12-11 05:25:18.010647277 +0000 UTC m=+620.427993975 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/ddeebf78-e410-4562-a173-563b43b1b322-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-c54rd" (UID: "ddeebf78-e410-4562-a173-563b43b1b322") : secret "openshift-nmstate-webhook" not found Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.536640 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56dg4\" (UniqueName: \"kubernetes.io/projected/ddeebf78-e410-4562-a173-563b43b1b322-kube-api-access-56dg4\") pod \"nmstate-webhook-5f6d4c5ccb-c54rd\" (UID: \"ddeebf78-e410-4562-a173-563b43b1b322\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-c54rd" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.537076 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbpjs\" (UniqueName: \"kubernetes.io/projected/3a77bda3-3bb4-402d-a4a9-df9e47e8ff39-kube-api-access-dbpjs\") pod \"nmstate-metrics-7f946cbc9-tqdqm\" (UID: \"3a77bda3-3bb4-402d-a4a9-df9e47e8ff39\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-tqdqm" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.584614 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-mpq2p"] Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.585229 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-mpq2p" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.592706 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.592858 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.592997 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mjv4f" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.605797 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-mpq2p"] Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.611395 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc6f9\" (UniqueName: \"kubernetes.io/projected/6b801585-cb83-40f7-ab06-68951c4455c6-kube-api-access-dc6f9\") pod \"nmstate-console-plugin-7fbb5f6569-mpq2p\" (UID: \"6b801585-cb83-40f7-ab06-68951c4455c6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-mpq2p" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.611448 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c2471a0c-a9c4-4323-9fb6-e67872046a7d-dbus-socket\") pod \"nmstate-handler-lg7wb\" (UID: \"c2471a0c-a9c4-4323-9fb6-e67872046a7d\") " pod="openshift-nmstate/nmstate-handler-lg7wb" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.611484 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c2471a0c-a9c4-4323-9fb6-e67872046a7d-ovs-socket\") pod \"nmstate-handler-lg7wb\" (UID: \"c2471a0c-a9c4-4323-9fb6-e67872046a7d\") " pod="openshift-nmstate/nmstate-handler-lg7wb" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 
05:25:17.611546 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6b801585-cb83-40f7-ab06-68951c4455c6-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-mpq2p\" (UID: \"6b801585-cb83-40f7-ab06-68951c4455c6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-mpq2p" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.611572 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c2471a0c-a9c4-4323-9fb6-e67872046a7d-nmstate-lock\") pod \"nmstate-handler-lg7wb\" (UID: \"c2471a0c-a9c4-4323-9fb6-e67872046a7d\") " pod="openshift-nmstate/nmstate-handler-lg7wb" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.611598 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b801585-cb83-40f7-ab06-68951c4455c6-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-mpq2p\" (UID: \"6b801585-cb83-40f7-ab06-68951c4455c6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-mpq2p" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.611648 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6f4s\" (UniqueName: \"kubernetes.io/projected/c2471a0c-a9c4-4323-9fb6-e67872046a7d-kube-api-access-b6f4s\") pod \"nmstate-handler-lg7wb\" (UID: \"c2471a0c-a9c4-4323-9fb6-e67872046a7d\") " pod="openshift-nmstate/nmstate-handler-lg7wb" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.612001 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c2471a0c-a9c4-4323-9fb6-e67872046a7d-nmstate-lock\") pod \"nmstate-handler-lg7wb\" (UID: \"c2471a0c-a9c4-4323-9fb6-e67872046a7d\") " pod="openshift-nmstate/nmstate-handler-lg7wb" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.611838 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c2471a0c-a9c4-4323-9fb6-e67872046a7d-ovs-socket\") pod \"nmstate-handler-lg7wb\" (UID: \"c2471a0c-a9c4-4323-9fb6-e67872046a7d\") " pod="openshift-nmstate/nmstate-handler-lg7wb" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.612250 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c2471a0c-a9c4-4323-9fb6-e67872046a7d-dbus-socket\") pod \"nmstate-handler-lg7wb\" (UID: \"c2471a0c-a9c4-4323-9fb6-e67872046a7d\") " pod="openshift-nmstate/nmstate-handler-lg7wb" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.647308 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6f4s\" (UniqueName: \"kubernetes.io/projected/c2471a0c-a9c4-4323-9fb6-e67872046a7d-kube-api-access-b6f4s\") pod \"nmstate-handler-lg7wb\" (UID: \"c2471a0c-a9c4-4323-9fb6-e67872046a7d\") " pod="openshift-nmstate/nmstate-handler-lg7wb" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.713078 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6b801585-cb83-40f7-ab06-68951c4455c6-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-mpq2p\" (UID: \"6b801585-cb83-40f7-ab06-68951c4455c6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-mpq2p" Dec 11 05:25:17 crc kubenswrapper[4628]: 
I1211 05:25:17.713124 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b801585-cb83-40f7-ab06-68951c4455c6-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-mpq2p\" (UID: \"6b801585-cb83-40f7-ab06-68951c4455c6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-mpq2p" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.713171 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc6f9\" (UniqueName: \"kubernetes.io/projected/6b801585-cb83-40f7-ab06-68951c4455c6-kube-api-access-dc6f9\") pod \"nmstate-console-plugin-7fbb5f6569-mpq2p\" (UID: \"6b801585-cb83-40f7-ab06-68951c4455c6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-mpq2p" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.714287 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6b801585-cb83-40f7-ab06-68951c4455c6-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-mpq2p\" (UID: \"6b801585-cb83-40f7-ab06-68951c4455c6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-mpq2p" Dec 11 05:25:17 crc kubenswrapper[4628]: E1211 05:25:17.714357 4628 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 11 05:25:17 crc kubenswrapper[4628]: E1211 05:25:17.714393 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b801585-cb83-40f7-ab06-68951c4455c6-plugin-serving-cert podName:6b801585-cb83-40f7-ab06-68951c4455c6 nodeName:}" failed. No retries permitted until 2025-12-11 05:25:18.21438175 +0000 UTC m=+620.631728448 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/6b801585-cb83-40f7-ab06-68951c4455c6-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-mpq2p" (UID: "6b801585-cb83-40f7-ab06-68951c4455c6") : secret "plugin-serving-cert" not found Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.718966 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-tqdqm" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.733147 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc6f9\" (UniqueName: \"kubernetes.io/projected/6b801585-cb83-40f7-ab06-68951c4455c6-kube-api-access-dc6f9\") pod \"nmstate-console-plugin-7fbb5f6569-mpq2p\" (UID: \"6b801585-cb83-40f7-ab06-68951c4455c6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-mpq2p" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.760112 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-lg7wb" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.787640 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7bb9b88c9c-2x8r9"] Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.788407 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.815264 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/37603459-4771-4a35-b1c0-0af7ad48cf41-oauth-serving-cert\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.815668 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/37603459-4771-4a35-b1c0-0af7ad48cf41-console-serving-cert\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.815740 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/37603459-4771-4a35-b1c0-0af7ad48cf41-console-oauth-config\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.815835 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/37603459-4771-4a35-b1c0-0af7ad48cf41-service-ca\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.815937 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/37603459-4771-4a35-b1c0-0af7ad48cf41-console-config\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.816009 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37603459-4771-4a35-b1c0-0af7ad48cf41-trusted-ca-bundle\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.816068 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dchnm\" (UniqueName: \"kubernetes.io/projected/37603459-4771-4a35-b1c0-0af7ad48cf41-kube-api-access-dchnm\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.816238 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bb9b88c9c-2x8r9"] Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.925253 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/37603459-4771-4a35-b1c0-0af7ad48cf41-oauth-serving-cert\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc 
kubenswrapper[4628]: I1211 05:25:17.925306 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/37603459-4771-4a35-b1c0-0af7ad48cf41-console-serving-cert\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.925324 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/37603459-4771-4a35-b1c0-0af7ad48cf41-console-oauth-config\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.925370 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/37603459-4771-4a35-b1c0-0af7ad48cf41-service-ca\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.925388 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/37603459-4771-4a35-b1c0-0af7ad48cf41-console-config\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.925407 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37603459-4771-4a35-b1c0-0af7ad48cf41-trusted-ca-bundle\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.925422 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dchnm\" (UniqueName: \"kubernetes.io/projected/37603459-4771-4a35-b1c0-0af7ad48cf41-kube-api-access-dchnm\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.926440 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/37603459-4771-4a35-b1c0-0af7ad48cf41-oauth-serving-cert\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.928000 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/37603459-4771-4a35-b1c0-0af7ad48cf41-service-ca\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.929374 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/37603459-4771-4a35-b1c0-0af7ad48cf41-console-config\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.930058 4628 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37603459-4771-4a35-b1c0-0af7ad48cf41-trusted-ca-bundle\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.932385 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/37603459-4771-4a35-b1c0-0af7ad48cf41-console-serving-cert\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.932405 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/37603459-4771-4a35-b1c0-0af7ad48cf41-console-oauth-config\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:17 crc kubenswrapper[4628]: I1211 05:25:17.948869 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dchnm\" (UniqueName: \"kubernetes.io/projected/37603459-4771-4a35-b1c0-0af7ad48cf41-kube-api-access-dchnm\") pod \"console-7bb9b88c9c-2x8r9\" (UID: \"37603459-4771-4a35-b1c0-0af7ad48cf41\") " pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:18 crc kubenswrapper[4628]: I1211 05:25:18.026512 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ddeebf78-e410-4562-a173-563b43b1b322-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-c54rd\" (UID: \"ddeebf78-e410-4562-a173-563b43b1b322\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-c54rd" Dec 11 05:25:18 crc kubenswrapper[4628]: I1211 05:25:18.028321 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-tqdqm"] Dec 11 05:25:18 crc kubenswrapper[4628]: I1211 05:25:18.029188 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ddeebf78-e410-4562-a173-563b43b1b322-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-c54rd\" (UID: \"ddeebf78-e410-4562-a173-563b43b1b322\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-c54rd" Dec 11 05:25:18 crc kubenswrapper[4628]: I1211 05:25:18.109429 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:18 crc kubenswrapper[4628]: I1211 05:25:18.231613 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b801585-cb83-40f7-ab06-68951c4455c6-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-mpq2p\" (UID: \"6b801585-cb83-40f7-ab06-68951c4455c6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-mpq2p" Dec 11 05:25:18 crc kubenswrapper[4628]: I1211 05:25:18.234899 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b801585-cb83-40f7-ab06-68951c4455c6-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-mpq2p\" (UID: \"6b801585-cb83-40f7-ab06-68951c4455c6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-mpq2p" Dec 11 05:25:18 crc kubenswrapper[4628]: I1211 05:25:18.312197 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bb9b88c9c-2x8r9"] Dec 11 05:25:18 crc kubenswrapper[4628]: W1211 05:25:18.317700 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37603459_4771_4a35_b1c0_0af7ad48cf41.slice/crio-4eda74053facc111c2322f720d77f420f835b95d734926bbdf6b623e656209a2 WatchSource:0}: Error finding container 4eda74053facc111c2322f720d77f420f835b95d734926bbdf6b623e656209a2: Status 404 returned error can't find the container with id 4eda74053facc111c2322f720d77f420f835b95d734926bbdf6b623e656209a2 Dec 11 05:25:18 crc kubenswrapper[4628]: I1211 05:25:18.328259 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-c54rd" Dec 11 05:25:18 crc kubenswrapper[4628]: I1211 05:25:18.454291 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-tqdqm" event={"ID":"3a77bda3-3bb4-402d-a4a9-df9e47e8ff39","Type":"ContainerStarted","Data":"5d894cc96cb669127edbee4c6fddb6794c1734d27c8c1d003ed06a8041bf2a3e"} Dec 11 05:25:18 crc kubenswrapper[4628]: I1211 05:25:18.455764 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bb9b88c9c-2x8r9" event={"ID":"37603459-4771-4a35-b1c0-0af7ad48cf41","Type":"ContainerStarted","Data":"4eda74053facc111c2322f720d77f420f835b95d734926bbdf6b623e656209a2"} Dec 11 05:25:18 crc kubenswrapper[4628]: I1211 05:25:18.456614 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-lg7wb" event={"ID":"c2471a0c-a9c4-4323-9fb6-e67872046a7d","Type":"ContainerStarted","Data":"77166e87a2d1fd56815ebdbb8291cdf6041dcc58baa5f8fe28d7963a2b8f5f03"} Dec 11 05:25:18 crc kubenswrapper[4628]: I1211 05:25:18.500519 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-mpq2p" Dec 11 05:25:18 crc kubenswrapper[4628]: I1211 05:25:18.547140 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-c54rd"] Dec 11 05:25:18 crc kubenswrapper[4628]: W1211 05:25:18.565787 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddeebf78_e410_4562_a173_563b43b1b322.slice/crio-b296b051128fec6d1266a7c97f6ec1e5541e02f47137d5eb7b90792e47e85abb WatchSource:0}: Error finding container b296b051128fec6d1266a7c97f6ec1e5541e02f47137d5eb7b90792e47e85abb: Status 404 returned error can't find the container with id b296b051128fec6d1266a7c97f6ec1e5541e02f47137d5eb7b90792e47e85abb Dec 11 05:25:18 crc kubenswrapper[4628]: I1211 05:25:18.705108 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-mpq2p"] Dec 11 05:25:19 crc kubenswrapper[4628]: I1211 05:25:19.464203 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bb9b88c9c-2x8r9" event={"ID":"37603459-4771-4a35-b1c0-0af7ad48cf41","Type":"ContainerStarted","Data":"2623976c9acd7fdb7e732404b8c0613250ae9f4a661ee638a23d1a74884bdbb1"} Dec 11 05:25:19 crc kubenswrapper[4628]: I1211 05:25:19.465091 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-c54rd" event={"ID":"ddeebf78-e410-4562-a173-563b43b1b322","Type":"ContainerStarted","Data":"b296b051128fec6d1266a7c97f6ec1e5541e02f47137d5eb7b90792e47e85abb"} Dec 11 05:25:19 crc kubenswrapper[4628]: I1211 05:25:19.467078 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-mpq2p" event={"ID":"6b801585-cb83-40f7-ab06-68951c4455c6","Type":"ContainerStarted","Data":"a813090d89541f151c38babe0324e480a814afce69d607e9223c1b8724eba87f"} Dec 11 05:25:19 crc kubenswrapper[4628]: I1211 05:25:19.487596 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7bb9b88c9c-2x8r9" podStartSLOduration=2.487552953 podStartE2EDuration="2.487552953s" podCreationTimestamp="2025-12-11 05:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:25:19.481482185 +0000 UTC m=+621.898828883" watchObservedRunningTime="2025-12-11 05:25:19.487552953 +0000 UTC m=+621.904899651" Dec 11 05:25:21 crc kubenswrapper[4628]: I1211 05:25:21.479131 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-lg7wb" event={"ID":"c2471a0c-a9c4-4323-9fb6-e67872046a7d","Type":"ContainerStarted","Data":"26567e8e4c4e7b75b13269c299d38819cbb8418c725275c14e57c960b2aef57b"} Dec 11 05:25:21 crc kubenswrapper[4628]: I1211 05:25:21.480455 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-lg7wb" Dec 11 05:25:21 crc kubenswrapper[4628]: I1211 05:25:21.482243 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-c54rd" event={"ID":"ddeebf78-e410-4562-a173-563b43b1b322","Type":"ContainerStarted","Data":"8b9473fd6daabbb4f1a296a2f6702c1367cf08d5d010a4e2d3107144ef8c43e6"} Dec 11 05:25:21 crc kubenswrapper[4628]: I1211 05:25:21.482374 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-c54rd" Dec 
11 05:25:21 crc kubenswrapper[4628]: I1211 05:25:21.489047 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-tqdqm" event={"ID":"3a77bda3-3bb4-402d-a4a9-df9e47e8ff39","Type":"ContainerStarted","Data":"1436f8e4dcaecf218ba53775067f336dbd7ab722698be1f9b80b7ee2b1389782"} Dec 11 05:25:21 crc kubenswrapper[4628]: I1211 05:25:21.498707 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-lg7wb" podStartSLOduration=1.803860273 podStartE2EDuration="4.498685338s" podCreationTimestamp="2025-12-11 05:25:17 +0000 UTC" firstStartedPulling="2025-12-11 05:25:17.807050878 +0000 UTC m=+620.224397576" lastFinishedPulling="2025-12-11 05:25:20.501875933 +0000 UTC m=+622.919222641" observedRunningTime="2025-12-11 05:25:21.494087701 +0000 UTC m=+623.911434459" watchObservedRunningTime="2025-12-11 05:25:21.498685338 +0000 UTC m=+623.916032026" Dec 11 05:25:21 crc kubenswrapper[4628]: I1211 05:25:21.509312 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-c54rd" podStartSLOduration=2.557178172 podStartE2EDuration="4.509281172s" podCreationTimestamp="2025-12-11 05:25:17 +0000 UTC" firstStartedPulling="2025-12-11 05:25:18.568509102 +0000 UTC m=+620.985855800" lastFinishedPulling="2025-12-11 05:25:20.520612102 +0000 UTC m=+622.937958800" observedRunningTime="2025-12-11 05:25:21.508274804 +0000 UTC m=+623.925621542" watchObservedRunningTime="2025-12-11 05:25:21.509281172 +0000 UTC m=+623.926627910" Dec 11 05:25:22 crc kubenswrapper[4628]: I1211 05:25:22.495392 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-mpq2p" event={"ID":"6b801585-cb83-40f7-ab06-68951c4455c6","Type":"ContainerStarted","Data":"78d1fd9fb7624c49c18ccb7889876b2cc189420fac3a8d5422dfd057a1af720f"} Dec 11 05:25:22 crc kubenswrapper[4628]: I1211 05:25:22.516082 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-mpq2p" podStartSLOduration=2.700144983 podStartE2EDuration="5.516065923s" podCreationTimestamp="2025-12-11 05:25:17 +0000 UTC" firstStartedPulling="2025-12-11 05:25:18.714039414 +0000 UTC m=+621.131386112" lastFinishedPulling="2025-12-11 05:25:21.529960354 +0000 UTC m=+623.947307052" observedRunningTime="2025-12-11 05:25:22.515779575 +0000 UTC m=+624.933126273" watchObservedRunningTime="2025-12-11 05:25:22.516065923 +0000 UTC m=+624.933412621" Dec 11 05:25:23 crc kubenswrapper[4628]: I1211 05:25:23.505765 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-tqdqm" event={"ID":"3a77bda3-3bb4-402d-a4a9-df9e47e8ff39","Type":"ContainerStarted","Data":"e8450d8c2487d00ce7d496d6974359825aefc08524238a8b82b65f5c34ce865c"} Dec 11 05:25:23 crc kubenswrapper[4628]: I1211 05:25:23.537025 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-tqdqm" podStartSLOduration=1.763270098 podStartE2EDuration="6.537009587s" podCreationTimestamp="2025-12-11 05:25:17 +0000 UTC" firstStartedPulling="2025-12-11 05:25:18.037394839 +0000 UTC m=+620.454741537" lastFinishedPulling="2025-12-11 05:25:22.811134328 +0000 UTC m=+625.228481026" observedRunningTime="2025-12-11 05:25:23.529831358 +0000 UTC m=+625.947178066" watchObservedRunningTime="2025-12-11 05:25:23.537009587 +0000 UTC m=+625.954356295" Dec 11 05:25:27 crc kubenswrapper[4628]: 
I1211 05:25:27.799025 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-lg7wb" Dec 11 05:25:28 crc kubenswrapper[4628]: I1211 05:25:28.110258 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:28 crc kubenswrapper[4628]: I1211 05:25:28.110353 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:28 crc kubenswrapper[4628]: I1211 05:25:28.116957 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:28 crc kubenswrapper[4628]: I1211 05:25:28.545645 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7bb9b88c9c-2x8r9" Dec 11 05:25:28 crc kubenswrapper[4628]: I1211 05:25:28.638557 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4nw5h"] Dec 11 05:25:38 crc kubenswrapper[4628]: I1211 05:25:38.338818 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-c54rd" Dec 11 05:25:51 crc kubenswrapper[4628]: I1211 05:25:51.964438 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx"] Dec 11 05:25:51 crc kubenswrapper[4628]: I1211 05:25:51.967107 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" Dec 11 05:25:51 crc kubenswrapper[4628]: I1211 05:25:51.970171 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 11 05:25:51 crc kubenswrapper[4628]: I1211 05:25:51.979641 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx"] Dec 11 05:25:52 crc kubenswrapper[4628]: I1211 05:25:52.034031 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssdqd\" (UniqueName: \"kubernetes.io/projected/861570ba-65cf-4e91-90c0-c26b0c452c0e-kube-api-access-ssdqd\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx\" (UID: \"861570ba-65cf-4e91-90c0-c26b0c452c0e\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" Dec 11 05:25:52 crc kubenswrapper[4628]: I1211 05:25:52.034099 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/861570ba-65cf-4e91-90c0-c26b0c452c0e-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx\" (UID: \"861570ba-65cf-4e91-90c0-c26b0c452c0e\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" Dec 11 05:25:52 crc kubenswrapper[4628]: I1211 05:25:52.034438 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/861570ba-65cf-4e91-90c0-c26b0c452c0e-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx\" (UID: \"861570ba-65cf-4e91-90c0-c26b0c452c0e\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" Dec 11 05:25:52 crc 
kubenswrapper[4628]: I1211 05:25:52.135577 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssdqd\" (UniqueName: \"kubernetes.io/projected/861570ba-65cf-4e91-90c0-c26b0c452c0e-kube-api-access-ssdqd\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx\" (UID: \"861570ba-65cf-4e91-90c0-c26b0c452c0e\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" Dec 11 05:25:52 crc kubenswrapper[4628]: I1211 05:25:52.135716 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/861570ba-65cf-4e91-90c0-c26b0c452c0e-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx\" (UID: \"861570ba-65cf-4e91-90c0-c26b0c452c0e\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" Dec 11 05:25:52 crc kubenswrapper[4628]: I1211 05:25:52.136688 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/861570ba-65cf-4e91-90c0-c26b0c452c0e-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx\" (UID: \"861570ba-65cf-4e91-90c0-c26b0c452c0e\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" Dec 11 05:25:52 crc kubenswrapper[4628]: I1211 05:25:52.137213 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/861570ba-65cf-4e91-90c0-c26b0c452c0e-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx\" (UID: \"861570ba-65cf-4e91-90c0-c26b0c452c0e\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" Dec 11 05:25:52 crc kubenswrapper[4628]: I1211 05:25:52.137804 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/861570ba-65cf-4e91-90c0-c26b0c452c0e-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx\" (UID: \"861570ba-65cf-4e91-90c0-c26b0c452c0e\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" Dec 11 05:25:52 crc kubenswrapper[4628]: I1211 05:25:52.163338 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssdqd\" (UniqueName: \"kubernetes.io/projected/861570ba-65cf-4e91-90c0-c26b0c452c0e-kube-api-access-ssdqd\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx\" (UID: \"861570ba-65cf-4e91-90c0-c26b0c452c0e\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" Dec 11 05:25:52 crc kubenswrapper[4628]: I1211 05:25:52.289479 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" Dec 11 05:25:52 crc kubenswrapper[4628]: I1211 05:25:52.733611 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx"] Dec 11 05:25:53 crc kubenswrapper[4628]: I1211 05:25:53.693362 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-4nw5h" podUID="5111b417-34a8-405f-a0b8-eab04e144ff8" containerName="console" containerID="cri-o://f50891023cfcf0af4d65d3ad86c94c09d92f91b6c0b5954b92f177e8b565045c" gracePeriod=15 Dec 11 05:25:53 crc kubenswrapper[4628]: I1211 05:25:53.731777 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" event={"ID":"861570ba-65cf-4e91-90c0-c26b0c452c0e","Type":"ContainerStarted","Data":"9b51d749ac3a5a38acc35eaf5735f1ba2c4e0df8b90079551d8f5834a44152b4"} Dec 11 05:25:53 crc kubenswrapper[4628]: I1211 05:25:53.731844 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" event={"ID":"861570ba-65cf-4e91-90c0-c26b0c452c0e","Type":"ContainerStarted","Data":"ed15de2d4afb83298d63187044c1aec3f2a657ae7e290753816439852757894a"} Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.388075 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4nw5h_5111b417-34a8-405f-a0b8-eab04e144ff8/console/0.log" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.388205 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.467597 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-oauth-serving-cert\") pod \"5111b417-34a8-405f-a0b8-eab04e144ff8\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.467637 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-service-ca\") pod \"5111b417-34a8-405f-a0b8-eab04e144ff8\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.468453 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5111b417-34a8-405f-a0b8-eab04e144ff8" (UID: "5111b417-34a8-405f-a0b8-eab04e144ff8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.468466 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-service-ca" (OuterVolumeSpecName: "service-ca") pod "5111b417-34a8-405f-a0b8-eab04e144ff8" (UID: "5111b417-34a8-405f-a0b8-eab04e144ff8"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.468566 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-trusted-ca-bundle\") pod \"5111b417-34a8-405f-a0b8-eab04e144ff8\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.469110 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5111b417-34a8-405f-a0b8-eab04e144ff8" (UID: "5111b417-34a8-405f-a0b8-eab04e144ff8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.469150 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kf9p\" (UniqueName: \"kubernetes.io/projected/5111b417-34a8-405f-a0b8-eab04e144ff8-kube-api-access-7kf9p\") pod \"5111b417-34a8-405f-a0b8-eab04e144ff8\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.469212 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5111b417-34a8-405f-a0b8-eab04e144ff8-console-serving-cert\") pod \"5111b417-34a8-405f-a0b8-eab04e144ff8\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.469244 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-console-config\") pod \"5111b417-34a8-405f-a0b8-eab04e144ff8\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.469581 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5111b417-34a8-405f-a0b8-eab04e144ff8-console-oauth-config\") pod \"5111b417-34a8-405f-a0b8-eab04e144ff8\" (UID: \"5111b417-34a8-405f-a0b8-eab04e144ff8\") " Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.469796 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-console-config" (OuterVolumeSpecName: "console-config") pod "5111b417-34a8-405f-a0b8-eab04e144ff8" (UID: "5111b417-34a8-405f-a0b8-eab04e144ff8"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.470002 4628 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.470020 4628 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-console-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.470033 4628 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.470045 4628 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5111b417-34a8-405f-a0b8-eab04e144ff8-service-ca\") on node \"crc\" DevicePath \"\"" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.473373 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5111b417-34a8-405f-a0b8-eab04e144ff8-kube-api-access-7kf9p" (OuterVolumeSpecName: "kube-api-access-7kf9p") pod "5111b417-34a8-405f-a0b8-eab04e144ff8" (UID: "5111b417-34a8-405f-a0b8-eab04e144ff8"). InnerVolumeSpecName "kube-api-access-7kf9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.473572 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5111b417-34a8-405f-a0b8-eab04e144ff8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5111b417-34a8-405f-a0b8-eab04e144ff8" (UID: "5111b417-34a8-405f-a0b8-eab04e144ff8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.474075 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5111b417-34a8-405f-a0b8-eab04e144ff8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5111b417-34a8-405f-a0b8-eab04e144ff8" (UID: "5111b417-34a8-405f-a0b8-eab04e144ff8"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.571792 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kf9p\" (UniqueName: \"kubernetes.io/projected/5111b417-34a8-405f-a0b8-eab04e144ff8-kube-api-access-7kf9p\") on node \"crc\" DevicePath \"\"" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.571833 4628 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5111b417-34a8-405f-a0b8-eab04e144ff8-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.571848 4628 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5111b417-34a8-405f-a0b8-eab04e144ff8-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.760802 4628 generic.go:334] "Generic (PLEG): container finished" podID="861570ba-65cf-4e91-90c0-c26b0c452c0e" containerID="9b51d749ac3a5a38acc35eaf5735f1ba2c4e0df8b90079551d8f5834a44152b4" exitCode=0 Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.762971 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" event={"ID":"861570ba-65cf-4e91-90c0-c26b0c452c0e","Type":"ContainerDied","Data":"9b51d749ac3a5a38acc35eaf5735f1ba2c4e0df8b90079551d8f5834a44152b4"} Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.763791 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4nw5h_5111b417-34a8-405f-a0b8-eab04e144ff8/console/0.log" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.763865 4628 generic.go:334] "Generic (PLEG): container finished" podID="5111b417-34a8-405f-a0b8-eab04e144ff8" containerID="f50891023cfcf0af4d65d3ad86c94c09d92f91b6c0b5954b92f177e8b565045c" exitCode=2 Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.763935 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4nw5h" event={"ID":"5111b417-34a8-405f-a0b8-eab04e144ff8","Type":"ContainerDied","Data":"f50891023cfcf0af4d65d3ad86c94c09d92f91b6c0b5954b92f177e8b565045c"} Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.763965 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4nw5h" event={"ID":"5111b417-34a8-405f-a0b8-eab04e144ff8","Type":"ContainerDied","Data":"333e7251ed6902bdf36ca2c25036a4cf58182e2132c20fe49bd99e863b186a17"} Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.763994 4628 scope.go:117] "RemoveContainer" containerID="f50891023cfcf0af4d65d3ad86c94c09d92f91b6c0b5954b92f177e8b565045c" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.764010 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-4nw5h" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.792526 4628 scope.go:117] "RemoveContainer" containerID="f50891023cfcf0af4d65d3ad86c94c09d92f91b6c0b5954b92f177e8b565045c" Dec 11 05:25:54 crc kubenswrapper[4628]: E1211 05:25:54.793440 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f50891023cfcf0af4d65d3ad86c94c09d92f91b6c0b5954b92f177e8b565045c\": container with ID starting with f50891023cfcf0af4d65d3ad86c94c09d92f91b6c0b5954b92f177e8b565045c not found: ID does not exist" containerID="f50891023cfcf0af4d65d3ad86c94c09d92f91b6c0b5954b92f177e8b565045c" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.793530 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f50891023cfcf0af4d65d3ad86c94c09d92f91b6c0b5954b92f177e8b565045c"} err="failed to get container status \"f50891023cfcf0af4d65d3ad86c94c09d92f91b6c0b5954b92f177e8b565045c\": rpc error: code = NotFound desc = could not find container \"f50891023cfcf0af4d65d3ad86c94c09d92f91b6c0b5954b92f177e8b565045c\": container with ID starting with f50891023cfcf0af4d65d3ad86c94c09d92f91b6c0b5954b92f177e8b565045c not found: ID does not exist" Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.815239 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4nw5h"] Dec 11 05:25:54 crc kubenswrapper[4628]: I1211 05:25:54.820180 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-4nw5h"] Dec 11 05:25:55 crc kubenswrapper[4628]: I1211 05:25:55.899198 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5111b417-34a8-405f-a0b8-eab04e144ff8" path="/var/lib/kubelet/pods/5111b417-34a8-405f-a0b8-eab04e144ff8/volumes" Dec 11 05:25:56 crc kubenswrapper[4628]: I1211 05:25:56.779216 4628 generic.go:334] "Generic (PLEG): container finished" podID="861570ba-65cf-4e91-90c0-c26b0c452c0e" containerID="c485dfa9bd4dc55fa27dcbf5953e02162136084f0e3cf4fd8dd703d5cf1bf6ad" exitCode=0 Dec 11 05:25:56 crc kubenswrapper[4628]: I1211 05:25:56.779283 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" event={"ID":"861570ba-65cf-4e91-90c0-c26b0c452c0e","Type":"ContainerDied","Data":"c485dfa9bd4dc55fa27dcbf5953e02162136084f0e3cf4fd8dd703d5cf1bf6ad"} Dec 11 05:25:57 crc kubenswrapper[4628]: I1211 05:25:57.790244 4628 generic.go:334] "Generic (PLEG): container finished" podID="861570ba-65cf-4e91-90c0-c26b0c452c0e" containerID="3f1fedcf178107c2c6429374c7501041a852b203f01dd07cdd6f35976f891395" exitCode=0 Dec 11 05:25:57 crc kubenswrapper[4628]: I1211 05:25:57.791003 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" event={"ID":"861570ba-65cf-4e91-90c0-c26b0c452c0e","Type":"ContainerDied","Data":"3f1fedcf178107c2c6429374c7501041a852b203f01dd07cdd6f35976f891395"} Dec 11 05:25:59 crc kubenswrapper[4628]: I1211 05:25:59.088199 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" Dec 11 05:25:59 crc kubenswrapper[4628]: I1211 05:25:59.139609 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssdqd\" (UniqueName: \"kubernetes.io/projected/861570ba-65cf-4e91-90c0-c26b0c452c0e-kube-api-access-ssdqd\") pod \"861570ba-65cf-4e91-90c0-c26b0c452c0e\" (UID: \"861570ba-65cf-4e91-90c0-c26b0c452c0e\") " Dec 11 05:25:59 crc kubenswrapper[4628]: I1211 05:25:59.139662 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/861570ba-65cf-4e91-90c0-c26b0c452c0e-util\") pod \"861570ba-65cf-4e91-90c0-c26b0c452c0e\" (UID: \"861570ba-65cf-4e91-90c0-c26b0c452c0e\") " Dec 11 05:25:59 crc kubenswrapper[4628]: I1211 05:25:59.139692 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/861570ba-65cf-4e91-90c0-c26b0c452c0e-bundle\") pod \"861570ba-65cf-4e91-90c0-c26b0c452c0e\" (UID: \"861570ba-65cf-4e91-90c0-c26b0c452c0e\") " Dec 11 05:25:59 crc kubenswrapper[4628]: I1211 05:25:59.140999 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/861570ba-65cf-4e91-90c0-c26b0c452c0e-bundle" (OuterVolumeSpecName: "bundle") pod "861570ba-65cf-4e91-90c0-c26b0c452c0e" (UID: "861570ba-65cf-4e91-90c0-c26b0c452c0e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:25:59 crc kubenswrapper[4628]: I1211 05:25:59.146508 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/861570ba-65cf-4e91-90c0-c26b0c452c0e-kube-api-access-ssdqd" (OuterVolumeSpecName: "kube-api-access-ssdqd") pod "861570ba-65cf-4e91-90c0-c26b0c452c0e" (UID: "861570ba-65cf-4e91-90c0-c26b0c452c0e"). InnerVolumeSpecName "kube-api-access-ssdqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:25:59 crc kubenswrapper[4628]: I1211 05:25:59.241019 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssdqd\" (UniqueName: \"kubernetes.io/projected/861570ba-65cf-4e91-90c0-c26b0c452c0e-kube-api-access-ssdqd\") on node \"crc\" DevicePath \"\"" Dec 11 05:25:59 crc kubenswrapper[4628]: I1211 05:25:59.241054 4628 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/861570ba-65cf-4e91-90c0-c26b0c452c0e-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:25:59 crc kubenswrapper[4628]: I1211 05:25:59.670163 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/861570ba-65cf-4e91-90c0-c26b0c452c0e-util" (OuterVolumeSpecName: "util") pod "861570ba-65cf-4e91-90c0-c26b0c452c0e" (UID: "861570ba-65cf-4e91-90c0-c26b0c452c0e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:25:59 crc kubenswrapper[4628]: I1211 05:25:59.747056 4628 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/861570ba-65cf-4e91-90c0-c26b0c452c0e-util\") on node \"crc\" DevicePath \"\"" Dec 11 05:25:59 crc kubenswrapper[4628]: I1211 05:25:59.809110 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" event={"ID":"861570ba-65cf-4e91-90c0-c26b0c452c0e","Type":"ContainerDied","Data":"ed15de2d4afb83298d63187044c1aec3f2a657ae7e290753816439852757894a"} Dec 11 05:25:59 crc kubenswrapper[4628]: I1211 05:25:59.809151 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed15de2d4afb83298d63187044c1aec3f2a657ae7e290753816439852757894a" Dec 11 05:25:59 crc kubenswrapper[4628]: I1211 05:25:59.809311 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx" Dec 11 05:26:04 crc kubenswrapper[4628]: I1211 05:26:04.286303 4628 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.087147 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p"] Dec 11 05:26:10 crc kubenswrapper[4628]: E1211 05:26:10.087949 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5111b417-34a8-405f-a0b8-eab04e144ff8" containerName="console" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.087964 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="5111b417-34a8-405f-a0b8-eab04e144ff8" containerName="console" Dec 11 05:26:10 crc kubenswrapper[4628]: E1211 05:26:10.087982 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861570ba-65cf-4e91-90c0-c26b0c452c0e" containerName="util" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.087990 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="861570ba-65cf-4e91-90c0-c26b0c452c0e" containerName="util" Dec 11 05:26:10 crc kubenswrapper[4628]: E1211 05:26:10.088001 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861570ba-65cf-4e91-90c0-c26b0c452c0e" containerName="pull" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.088009 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="861570ba-65cf-4e91-90c0-c26b0c452c0e" containerName="pull" Dec 11 05:26:10 crc kubenswrapper[4628]: E1211 05:26:10.088028 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861570ba-65cf-4e91-90c0-c26b0c452c0e" containerName="extract" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.088035 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="861570ba-65cf-4e91-90c0-c26b0c452c0e" containerName="extract" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.088162 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="5111b417-34a8-405f-a0b8-eab04e144ff8" containerName="console" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.088173 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="861570ba-65cf-4e91-90c0-c26b0c452c0e" containerName="extract" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.088585 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.090179 4628 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.090576 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.094814 4628 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.095046 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.095735 4628 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-c7gf2" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.118162 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p"] Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.196788 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4jdk\" (UniqueName: \"kubernetes.io/projected/8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e-kube-api-access-n4jdk\") pod \"metallb-operator-controller-manager-5f9c8b77b-f478p\" (UID: \"8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e\") " pod="metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.196864 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e-apiservice-cert\") pod \"metallb-operator-controller-manager-5f9c8b77b-f478p\" (UID: \"8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e\") " pod="metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.196897 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e-webhook-cert\") pod \"metallb-operator-controller-manager-5f9c8b77b-f478p\" (UID: \"8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e\") " pod="metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.297976 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4jdk\" (UniqueName: \"kubernetes.io/projected/8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e-kube-api-access-n4jdk\") pod \"metallb-operator-controller-manager-5f9c8b77b-f478p\" (UID: \"8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e\") " pod="metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.298023 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e-apiservice-cert\") pod \"metallb-operator-controller-manager-5f9c8b77b-f478p\" (UID: \"8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e\") " pod="metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.298058 4628 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e-webhook-cert\") pod \"metallb-operator-controller-manager-5f9c8b77b-f478p\" (UID: \"8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e\") " pod="metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.305641 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e-apiservice-cert\") pod \"metallb-operator-controller-manager-5f9c8b77b-f478p\" (UID: \"8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e\") " pod="metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.317578 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e-webhook-cert\") pod \"metallb-operator-controller-manager-5f9c8b77b-f478p\" (UID: \"8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e\") " pod="metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.317764 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4jdk\" (UniqueName: \"kubernetes.io/projected/8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e-kube-api-access-n4jdk\") pod \"metallb-operator-controller-manager-5f9c8b77b-f478p\" (UID: \"8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e\") " pod="metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.348655 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf"] Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.349384 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.354220 4628 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.354283 4628 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-c2gfm" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.354494 4628 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.374835 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf"] Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.428574 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.530656 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f91be819-4cd2-4c94-98a2-108b05ab0a23-apiservice-cert\") pod \"metallb-operator-webhook-server-68cfc95d7c-4ssjf\" (UID: \"f91be819-4cd2-4c94-98a2-108b05ab0a23\") " pod="metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.530976 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4pg2\" (UniqueName: \"kubernetes.io/projected/f91be819-4cd2-4c94-98a2-108b05ab0a23-kube-api-access-w4pg2\") pod \"metallb-operator-webhook-server-68cfc95d7c-4ssjf\" (UID: \"f91be819-4cd2-4c94-98a2-108b05ab0a23\") " pod="metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.531000 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f91be819-4cd2-4c94-98a2-108b05ab0a23-webhook-cert\") pod \"metallb-operator-webhook-server-68cfc95d7c-4ssjf\" (UID: \"f91be819-4cd2-4c94-98a2-108b05ab0a23\") " pod="metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.632712 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f91be819-4cd2-4c94-98a2-108b05ab0a23-webhook-cert\") pod \"metallb-operator-webhook-server-68cfc95d7c-4ssjf\" (UID: \"f91be819-4cd2-4c94-98a2-108b05ab0a23\") " pod="metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.632821 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f91be819-4cd2-4c94-98a2-108b05ab0a23-apiservice-cert\") pod \"metallb-operator-webhook-server-68cfc95d7c-4ssjf\" (UID: \"f91be819-4cd2-4c94-98a2-108b05ab0a23\") " pod="metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.632893 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4pg2\" (UniqueName: \"kubernetes.io/projected/f91be819-4cd2-4c94-98a2-108b05ab0a23-kube-api-access-w4pg2\") pod \"metallb-operator-webhook-server-68cfc95d7c-4ssjf\" (UID: \"f91be819-4cd2-4c94-98a2-108b05ab0a23\") " pod="metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.643627 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f91be819-4cd2-4c94-98a2-108b05ab0a23-webhook-cert\") pod \"metallb-operator-webhook-server-68cfc95d7c-4ssjf\" (UID: \"f91be819-4cd2-4c94-98a2-108b05ab0a23\") " pod="metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.644290 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f91be819-4cd2-4c94-98a2-108b05ab0a23-apiservice-cert\") pod \"metallb-operator-webhook-server-68cfc95d7c-4ssjf\" (UID: \"f91be819-4cd2-4c94-98a2-108b05ab0a23\") " 
pod="metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.660557 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4pg2\" (UniqueName: \"kubernetes.io/projected/f91be819-4cd2-4c94-98a2-108b05ab0a23-kube-api-access-w4pg2\") pod \"metallb-operator-webhook-server-68cfc95d7c-4ssjf\" (UID: \"f91be819-4cd2-4c94-98a2-108b05ab0a23\") " pod="metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.663613 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf" Dec 11 05:26:10 crc kubenswrapper[4628]: I1211 05:26:10.932921 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p"] Dec 11 05:26:11 crc kubenswrapper[4628]: I1211 05:26:11.096706 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf"] Dec 11 05:26:11 crc kubenswrapper[4628]: W1211 05:26:11.100063 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf91be819_4cd2_4c94_98a2_108b05ab0a23.slice/crio-0dcbaf8fda6d78e711433698fd82619a6a6018ec656416e0611954320b6adb22 WatchSource:0}: Error finding container 0dcbaf8fda6d78e711433698fd82619a6a6018ec656416e0611954320b6adb22: Status 404 returned error can't find the container with id 0dcbaf8fda6d78e711433698fd82619a6a6018ec656416e0611954320b6adb22 Dec 11 05:26:11 crc kubenswrapper[4628]: I1211 05:26:11.880410 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p" event={"ID":"8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e","Type":"ContainerStarted","Data":"f624e09f8071688471cd2a94e359d9f4e681730baf204b22a6b4340a3bc895d7"} Dec 11 05:26:11 crc kubenswrapper[4628]: I1211 05:26:11.881541 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf" event={"ID":"f91be819-4cd2-4c94-98a2-108b05ab0a23","Type":"ContainerStarted","Data":"0dcbaf8fda6d78e711433698fd82619a6a6018ec656416e0611954320b6adb22"} Dec 11 05:26:14 crc kubenswrapper[4628]: I1211 05:26:14.938750 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p" event={"ID":"8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e","Type":"ContainerStarted","Data":"00acbfe037e3579654a78e1f61cbdae501288c0f5bf5efe6211ff3c0209177fd"} Dec 11 05:26:14 crc kubenswrapper[4628]: I1211 05:26:14.939072 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p" Dec 11 05:26:14 crc kubenswrapper[4628]: I1211 05:26:14.992042 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p" podStartSLOduration=1.755901502 podStartE2EDuration="4.992022059s" podCreationTimestamp="2025-12-11 05:26:10 +0000 UTC" firstStartedPulling="2025-12-11 05:26:10.941525067 +0000 UTC m=+673.358871765" lastFinishedPulling="2025-12-11 05:26:14.177645624 +0000 UTC m=+676.594992322" observedRunningTime="2025-12-11 05:26:14.982835155 +0000 UTC m=+677.400181853" watchObservedRunningTime="2025-12-11 05:26:14.992022059 +0000 UTC m=+677.409368757" Dec 11 05:26:16 crc 
kubenswrapper[4628]: I1211 05:26:16.951192 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf" event={"ID":"f91be819-4cd2-4c94-98a2-108b05ab0a23","Type":"ContainerStarted","Data":"930558013efc2d0a08249a43f530583fa5b48faf395d037e4155271d5637056f"} Dec 11 05:26:16 crc kubenswrapper[4628]: I1211 05:26:16.952151 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf" Dec 11 05:26:30 crc kubenswrapper[4628]: I1211 05:26:30.673453 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf" Dec 11 05:26:30 crc kubenswrapper[4628]: I1211 05:26:30.700351 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-68cfc95d7c-4ssjf" podStartSLOduration=15.619404252 podStartE2EDuration="20.700337218s" podCreationTimestamp="2025-12-11 05:26:10 +0000 UTC" firstStartedPulling="2025-12-11 05:26:11.103800487 +0000 UTC m=+673.521147175" lastFinishedPulling="2025-12-11 05:26:16.184733423 +0000 UTC m=+678.602080141" observedRunningTime="2025-12-11 05:26:16.994862241 +0000 UTC m=+679.412208949" watchObservedRunningTime="2025-12-11 05:26:30.700337218 +0000 UTC m=+693.117683916" Dec 11 05:26:50 crc kubenswrapper[4628]: I1211 05:26:50.432665 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5f9c8b77b-f478p" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.144919 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-97bqw"] Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.147752 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.148860 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-5n5bs"] Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.149539 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5n5bs" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.150545 4628 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-nz92z" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.152640 4628 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.152817 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.153654 4628 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.170545 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-5n5bs"] Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.262063 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-v2km9"] Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.263078 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-v2km9" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.265511 4628 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.265605 4628 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-tfmzq" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.270204 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.270208 4628 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.291437 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-q4v8q"] Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.292257 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-q4v8q" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.298486 4628 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.302493 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-q4v8q"] Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.322740 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c9bb5b6-577c-4b87-af2e-445ca30f9732-metrics-certs\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.322996 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1c9bb5b6-577c-4b87-af2e-445ca30f9732-metrics\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.323059 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1c9bb5b6-577c-4b87-af2e-445ca30f9732-reloader\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.323161 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abb0cb74-4b39-47f1-9a3a-cae28b6c32f6-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-5n5bs\" (UID: \"abb0cb74-4b39-47f1-9a3a-cae28b6c32f6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5n5bs" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.323186 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnzpn\" (UniqueName: \"kubernetes.io/projected/1c9bb5b6-577c-4b87-af2e-445ca30f9732-kube-api-access-nnzpn\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.323222 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kfnr\" 
(UniqueName: \"kubernetes.io/projected/abb0cb74-4b39-47f1-9a3a-cae28b6c32f6-kube-api-access-4kfnr\") pod \"frr-k8s-webhook-server-7fcb986d4-5n5bs\" (UID: \"abb0cb74-4b39-47f1-9a3a-cae28b6c32f6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5n5bs" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.323258 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1c9bb5b6-577c-4b87-af2e-445ca30f9732-frr-startup\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.323296 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1c9bb5b6-577c-4b87-af2e-445ca30f9732-frr-conf\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.323358 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1c9bb5b6-577c-4b87-af2e-445ca30f9732-frr-sockets\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.424712 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abb0cb74-4b39-47f1-9a3a-cae28b6c32f6-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-5n5bs\" (UID: \"abb0cb74-4b39-47f1-9a3a-cae28b6c32f6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5n5bs" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.424758 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnzpn\" (UniqueName: \"kubernetes.io/projected/1c9bb5b6-577c-4b87-af2e-445ca30f9732-kube-api-access-nnzpn\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.424787 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0cea36da-fd5c-416d-ab52-9500bc3fae0e-metrics-certs\") pod \"speaker-v2km9\" (UID: \"0cea36da-fd5c-416d-ab52-9500bc3fae0e\") " pod="metallb-system/speaker-v2km9" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.424812 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kfnr\" (UniqueName: \"kubernetes.io/projected/abb0cb74-4b39-47f1-9a3a-cae28b6c32f6-kube-api-access-4kfnr\") pod \"frr-k8s-webhook-server-7fcb986d4-5n5bs\" (UID: \"abb0cb74-4b39-47f1-9a3a-cae28b6c32f6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5n5bs" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.424862 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qsdk\" (UniqueName: \"kubernetes.io/projected/7fbbee42-6c6f-4b6f-a8f4-a7acb4686612-kube-api-access-9qsdk\") pod \"controller-f8648f98b-q4v8q\" (UID: \"7fbbee42-6c6f-4b6f-a8f4-a7acb4686612\") " pod="metallb-system/controller-f8648f98b-q4v8q" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.424887 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/1c9bb5b6-577c-4b87-af2e-445ca30f9732-frr-startup\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.425150 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1c9bb5b6-577c-4b87-af2e-445ca30f9732-frr-conf\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.425218 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0cea36da-fd5c-416d-ab52-9500bc3fae0e-memberlist\") pod \"speaker-v2km9\" (UID: \"0cea36da-fd5c-416d-ab52-9500bc3fae0e\") " pod="metallb-system/speaker-v2km9" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.425237 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9j4m\" (UniqueName: \"kubernetes.io/projected/0cea36da-fd5c-416d-ab52-9500bc3fae0e-kube-api-access-n9j4m\") pod \"speaker-v2km9\" (UID: \"0cea36da-fd5c-416d-ab52-9500bc3fae0e\") " pod="metallb-system/speaker-v2km9" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.425256 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fbbee42-6c6f-4b6f-a8f4-a7acb4686612-cert\") pod \"controller-f8648f98b-q4v8q\" (UID: \"7fbbee42-6c6f-4b6f-a8f4-a7acb4686612\") " pod="metallb-system/controller-f8648f98b-q4v8q" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.425272 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1c9bb5b6-577c-4b87-af2e-445ca30f9732-frr-sockets\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.425303 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c9bb5b6-577c-4b87-af2e-445ca30f9732-metrics-certs\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.425322 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1c9bb5b6-577c-4b87-af2e-445ca30f9732-metrics\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.425337 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0cea36da-fd5c-416d-ab52-9500bc3fae0e-metallb-excludel2\") pod \"speaker-v2km9\" (UID: \"0cea36da-fd5c-416d-ab52-9500bc3fae0e\") " pod="metallb-system/speaker-v2km9" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.425351 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1c9bb5b6-577c-4b87-af2e-445ca30f9732-reloader\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 
05:26:51.425385 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fbbee42-6c6f-4b6f-a8f4-a7acb4686612-metrics-certs\") pod \"controller-f8648f98b-q4v8q\" (UID: \"7fbbee42-6c6f-4b6f-a8f4-a7acb4686612\") " pod="metallb-system/controller-f8648f98b-q4v8q" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.425550 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1c9bb5b6-577c-4b87-af2e-445ca30f9732-frr-conf\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.425725 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1c9bb5b6-577c-4b87-af2e-445ca30f9732-frr-sockets\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.425769 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1c9bb5b6-577c-4b87-af2e-445ca30f9732-metrics\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.425817 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1c9bb5b6-577c-4b87-af2e-445ca30f9732-frr-startup\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.426002 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1c9bb5b6-577c-4b87-af2e-445ca30f9732-reloader\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.439597 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abb0cb74-4b39-47f1-9a3a-cae28b6c32f6-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-5n5bs\" (UID: \"abb0cb74-4b39-47f1-9a3a-cae28b6c32f6\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5n5bs" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.443046 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnzpn\" (UniqueName: \"kubernetes.io/projected/1c9bb5b6-577c-4b87-af2e-445ca30f9732-kube-api-access-nnzpn\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.444480 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1c9bb5b6-577c-4b87-af2e-445ca30f9732-metrics-certs\") pod \"frr-k8s-97bqw\" (UID: \"1c9bb5b6-577c-4b87-af2e-445ca30f9732\") " pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.446689 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kfnr\" (UniqueName: \"kubernetes.io/projected/abb0cb74-4b39-47f1-9a3a-cae28b6c32f6-kube-api-access-4kfnr\") pod \"frr-k8s-webhook-server-7fcb986d4-5n5bs\" (UID: \"abb0cb74-4b39-47f1-9a3a-cae28b6c32f6\") " 
pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5n5bs" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.466064 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-97bqw" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.470551 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5n5bs" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.528109 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0cea36da-fd5c-416d-ab52-9500bc3fae0e-memberlist\") pod \"speaker-v2km9\" (UID: \"0cea36da-fd5c-416d-ab52-9500bc3fae0e\") " pod="metallb-system/speaker-v2km9" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.528150 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9j4m\" (UniqueName: \"kubernetes.io/projected/0cea36da-fd5c-416d-ab52-9500bc3fae0e-kube-api-access-n9j4m\") pod \"speaker-v2km9\" (UID: \"0cea36da-fd5c-416d-ab52-9500bc3fae0e\") " pod="metallb-system/speaker-v2km9" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.528174 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fbbee42-6c6f-4b6f-a8f4-a7acb4686612-cert\") pod \"controller-f8648f98b-q4v8q\" (UID: \"7fbbee42-6c6f-4b6f-a8f4-a7acb4686612\") " pod="metallb-system/controller-f8648f98b-q4v8q" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.528220 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0cea36da-fd5c-416d-ab52-9500bc3fae0e-metallb-excludel2\") pod \"speaker-v2km9\" (UID: \"0cea36da-fd5c-416d-ab52-9500bc3fae0e\") " pod="metallb-system/speaker-v2km9" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.528255 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fbbee42-6c6f-4b6f-a8f4-a7acb4686612-metrics-certs\") pod \"controller-f8648f98b-q4v8q\" (UID: \"7fbbee42-6c6f-4b6f-a8f4-a7acb4686612\") " pod="metallb-system/controller-f8648f98b-q4v8q" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.528282 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0cea36da-fd5c-416d-ab52-9500bc3fae0e-metrics-certs\") pod \"speaker-v2km9\" (UID: \"0cea36da-fd5c-416d-ab52-9500bc3fae0e\") " pod="metallb-system/speaker-v2km9" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.528299 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qsdk\" (UniqueName: \"kubernetes.io/projected/7fbbee42-6c6f-4b6f-a8f4-a7acb4686612-kube-api-access-9qsdk\") pod \"controller-f8648f98b-q4v8q\" (UID: \"7fbbee42-6c6f-4b6f-a8f4-a7acb4686612\") " pod="metallb-system/controller-f8648f98b-q4v8q" Dec 11 05:26:51 crc kubenswrapper[4628]: E1211 05:26:51.528369 4628 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.529700 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0cea36da-fd5c-416d-ab52-9500bc3fae0e-metallb-excludel2\") pod \"speaker-v2km9\" (UID: \"0cea36da-fd5c-416d-ab52-9500bc3fae0e\") " 
pod="metallb-system/speaker-v2km9" Dec 11 05:26:51 crc kubenswrapper[4628]: E1211 05:26:51.530347 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cea36da-fd5c-416d-ab52-9500bc3fae0e-memberlist podName:0cea36da-fd5c-416d-ab52-9500bc3fae0e nodeName:}" failed. No retries permitted until 2025-12-11 05:26:52.028460536 +0000 UTC m=+714.445807234 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0cea36da-fd5c-416d-ab52-9500bc3fae0e-memberlist") pod "speaker-v2km9" (UID: "0cea36da-fd5c-416d-ab52-9500bc3fae0e") : secret "metallb-memberlist" not found Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.532974 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fbbee42-6c6f-4b6f-a8f4-a7acb4686612-cert\") pod \"controller-f8648f98b-q4v8q\" (UID: \"7fbbee42-6c6f-4b6f-a8f4-a7acb4686612\") " pod="metallb-system/controller-f8648f98b-q4v8q" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.533919 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7fbbee42-6c6f-4b6f-a8f4-a7acb4686612-metrics-certs\") pod \"controller-f8648f98b-q4v8q\" (UID: \"7fbbee42-6c6f-4b6f-a8f4-a7acb4686612\") " pod="metallb-system/controller-f8648f98b-q4v8q" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.534209 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0cea36da-fd5c-416d-ab52-9500bc3fae0e-metrics-certs\") pod \"speaker-v2km9\" (UID: \"0cea36da-fd5c-416d-ab52-9500bc3fae0e\") " pod="metallb-system/speaker-v2km9" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.549603 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qsdk\" (UniqueName: \"kubernetes.io/projected/7fbbee42-6c6f-4b6f-a8f4-a7acb4686612-kube-api-access-9qsdk\") pod \"controller-f8648f98b-q4v8q\" (UID: \"7fbbee42-6c6f-4b6f-a8f4-a7acb4686612\") " pod="metallb-system/controller-f8648f98b-q4v8q" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.551261 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9j4m\" (UniqueName: \"kubernetes.io/projected/0cea36da-fd5c-416d-ab52-9500bc3fae0e-kube-api-access-n9j4m\") pod \"speaker-v2km9\" (UID: \"0cea36da-fd5c-416d-ab52-9500bc3fae0e\") " pod="metallb-system/speaker-v2km9" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.610933 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-q4v8q" Dec 11 05:26:51 crc kubenswrapper[4628]: I1211 05:26:51.746428 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-5n5bs"] Dec 11 05:26:51 crc kubenswrapper[4628]: W1211 05:26:51.761711 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabb0cb74_4b39_47f1_9a3a_cae28b6c32f6.slice/crio-494a8e60b94189e03e45cccfd5e2b7c3936097d728694855de51753d8b9bd1b4 WatchSource:0}: Error finding container 494a8e60b94189e03e45cccfd5e2b7c3936097d728694855de51753d8b9bd1b4: Status 404 returned error can't find the container with id 494a8e60b94189e03e45cccfd5e2b7c3936097d728694855de51753d8b9bd1b4 Dec 11 05:26:52 crc kubenswrapper[4628]: I1211 05:26:52.033108 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0cea36da-fd5c-416d-ab52-9500bc3fae0e-memberlist\") pod \"speaker-v2km9\" (UID: \"0cea36da-fd5c-416d-ab52-9500bc3fae0e\") " pod="metallb-system/speaker-v2km9" Dec 11 05:26:52 crc kubenswrapper[4628]: E1211 05:26:52.034024 4628 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 11 05:26:52 crc kubenswrapper[4628]: E1211 05:26:52.034073 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cea36da-fd5c-416d-ab52-9500bc3fae0e-memberlist podName:0cea36da-fd5c-416d-ab52-9500bc3fae0e nodeName:}" failed. No retries permitted until 2025-12-11 05:26:53.034057366 +0000 UTC m=+715.451404064 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/0cea36da-fd5c-416d-ab52-9500bc3fae0e-memberlist") pod "speaker-v2km9" (UID: "0cea36da-fd5c-416d-ab52-9500bc3fae0e") : secret "metallb-memberlist" not found Dec 11 05:26:52 crc kubenswrapper[4628]: I1211 05:26:52.061566 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-q4v8q"] Dec 11 05:26:52 crc kubenswrapper[4628]: W1211 05:26:52.068013 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fbbee42_6c6f_4b6f_a8f4_a7acb4686612.slice/crio-7d7f94a25adbc1a1ccefaa84dbd084429feb68fd8da519cbd0157d20cfb41909 WatchSource:0}: Error finding container 7d7f94a25adbc1a1ccefaa84dbd084429feb68fd8da519cbd0157d20cfb41909: Status 404 returned error can't find the container with id 7d7f94a25adbc1a1ccefaa84dbd084429feb68fd8da519cbd0157d20cfb41909 Dec 11 05:26:52 crc kubenswrapper[4628]: I1211 05:26:52.154116 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5n5bs" event={"ID":"abb0cb74-4b39-47f1-9a3a-cae28b6c32f6","Type":"ContainerStarted","Data":"494a8e60b94189e03e45cccfd5e2b7c3936097d728694855de51753d8b9bd1b4"} Dec 11 05:26:52 crc kubenswrapper[4628]: I1211 05:26:52.154776 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-97bqw" event={"ID":"1c9bb5b6-577c-4b87-af2e-445ca30f9732","Type":"ContainerStarted","Data":"20362f8e468faba6017bfcf2cba3d17502210770db453dbaedd00ecd208c1e85"} Dec 11 05:26:52 crc kubenswrapper[4628]: I1211 05:26:52.155413 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-q4v8q" 
event={"ID":"7fbbee42-6c6f-4b6f-a8f4-a7acb4686612","Type":"ContainerStarted","Data":"7d7f94a25adbc1a1ccefaa84dbd084429feb68fd8da519cbd0157d20cfb41909"} Dec 11 05:26:53 crc kubenswrapper[4628]: I1211 05:26:53.048747 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0cea36da-fd5c-416d-ab52-9500bc3fae0e-memberlist\") pod \"speaker-v2km9\" (UID: \"0cea36da-fd5c-416d-ab52-9500bc3fae0e\") " pod="metallb-system/speaker-v2km9" Dec 11 05:26:53 crc kubenswrapper[4628]: I1211 05:26:53.075962 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0cea36da-fd5c-416d-ab52-9500bc3fae0e-memberlist\") pod \"speaker-v2km9\" (UID: \"0cea36da-fd5c-416d-ab52-9500bc3fae0e\") " pod="metallb-system/speaker-v2km9" Dec 11 05:26:53 crc kubenswrapper[4628]: I1211 05:26:53.076731 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-v2km9" Dec 11 05:26:53 crc kubenswrapper[4628]: I1211 05:26:53.165969 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-v2km9" event={"ID":"0cea36da-fd5c-416d-ab52-9500bc3fae0e","Type":"ContainerStarted","Data":"e463528ed44b1a1d4a803ddd5f981083b11cd5363d07bf3020073333c74d3be9"} Dec 11 05:26:53 crc kubenswrapper[4628]: I1211 05:26:53.168455 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-q4v8q" event={"ID":"7fbbee42-6c6f-4b6f-a8f4-a7acb4686612","Type":"ContainerStarted","Data":"3c8a982382ec00fe9dedb361491b9878140141c9526c8e71f504ef002051f1c6"} Dec 11 05:26:53 crc kubenswrapper[4628]: I1211 05:26:53.168503 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-q4v8q" event={"ID":"7fbbee42-6c6f-4b6f-a8f4-a7acb4686612","Type":"ContainerStarted","Data":"d2b6debe80841bfd958af4d2f00c8d84921ac0fd1c53de12348f2136f4d8214f"} Dec 11 05:26:53 crc kubenswrapper[4628]: I1211 05:26:53.168603 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-q4v8q" Dec 11 05:26:53 crc kubenswrapper[4628]: I1211 05:26:53.188873 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-q4v8q" podStartSLOduration=2.188856771 podStartE2EDuration="2.188856771s" podCreationTimestamp="2025-12-11 05:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:26:53.18701877 +0000 UTC m=+715.604365468" watchObservedRunningTime="2025-12-11 05:26:53.188856771 +0000 UTC m=+715.606203469" Dec 11 05:26:54 crc kubenswrapper[4628]: I1211 05:26:54.182112 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-v2km9" event={"ID":"0cea36da-fd5c-416d-ab52-9500bc3fae0e","Type":"ContainerStarted","Data":"b1483a3e924d35958b47a974f07a54c5e5a493e98f542e1badc135e47e8a3f00"} Dec 11 05:26:54 crc kubenswrapper[4628]: I1211 05:26:54.182428 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-v2km9" event={"ID":"0cea36da-fd5c-416d-ab52-9500bc3fae0e","Type":"ContainerStarted","Data":"f2a5eae2d51f4c5f392b4c0e1e80bb76ba8e48dedc8bc6608257f73f1aab762b"} Dec 11 05:26:54 crc kubenswrapper[4628]: I1211 05:26:54.204512 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-v2km9" podStartSLOduration=3.204497936 
podStartE2EDuration="3.204497936s" podCreationTimestamp="2025-12-11 05:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:26:54.203062637 +0000 UTC m=+716.620409345" watchObservedRunningTime="2025-12-11 05:26:54.204497936 +0000 UTC m=+716.621844634" Dec 11 05:26:55 crc kubenswrapper[4628]: I1211 05:26:55.187479 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-v2km9" Dec 11 05:26:59 crc kubenswrapper[4628]: I1211 05:26:59.212467 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5n5bs" event={"ID":"abb0cb74-4b39-47f1-9a3a-cae28b6c32f6","Type":"ContainerStarted","Data":"58efd60a90d48c5b2afa92705405be10a7cccdcdaee60cc7969fca59771c0981"} Dec 11 05:26:59 crc kubenswrapper[4628]: I1211 05:26:59.213148 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5n5bs" Dec 11 05:26:59 crc kubenswrapper[4628]: I1211 05:26:59.214560 4628 generic.go:334] "Generic (PLEG): container finished" podID="1c9bb5b6-577c-4b87-af2e-445ca30f9732" containerID="dbfc5572f3b4a1dcc0fbacbbcd438899732f7d2ee9b8e8d94eea63f0883d7083" exitCode=0 Dec 11 05:26:59 crc kubenswrapper[4628]: I1211 05:26:59.214605 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-97bqw" event={"ID":"1c9bb5b6-577c-4b87-af2e-445ca30f9732","Type":"ContainerDied","Data":"dbfc5572f3b4a1dcc0fbacbbcd438899732f7d2ee9b8e8d94eea63f0883d7083"} Dec 11 05:26:59 crc kubenswrapper[4628]: I1211 05:26:59.244060 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5n5bs" podStartSLOduration=1.112321942 podStartE2EDuration="8.244032083s" podCreationTimestamp="2025-12-11 05:26:51 +0000 UTC" firstStartedPulling="2025-12-11 05:26:51.763266704 +0000 UTC m=+714.180613402" lastFinishedPulling="2025-12-11 05:26:58.894976835 +0000 UTC m=+721.312323543" observedRunningTime="2025-12-11 05:26:59.236326341 +0000 UTC m=+721.653673049" watchObservedRunningTime="2025-12-11 05:26:59.244032083 +0000 UTC m=+721.661378821" Dec 11 05:27:00 crc kubenswrapper[4628]: I1211 05:27:00.225266 4628 generic.go:334] "Generic (PLEG): container finished" podID="1c9bb5b6-577c-4b87-af2e-445ca30f9732" containerID="e3526da90d5de6edb143075b1d37e602e36f39a8c82ff0d5e7e6597a5f6a6b08" exitCode=0 Dec 11 05:27:00 crc kubenswrapper[4628]: I1211 05:27:00.225389 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-97bqw" event={"ID":"1c9bb5b6-577c-4b87-af2e-445ca30f9732","Type":"ContainerDied","Data":"e3526da90d5de6edb143075b1d37e602e36f39a8c82ff0d5e7e6597a5f6a6b08"} Dec 11 05:27:01 crc kubenswrapper[4628]: I1211 05:27:01.236957 4628 generic.go:334] "Generic (PLEG): container finished" podID="1c9bb5b6-577c-4b87-af2e-445ca30f9732" containerID="e202d4a86d7d88134a51eccef4881ed2130f63da4fb4ab94c29fc15597aca521" exitCode=0 Dec 11 05:27:01 crc kubenswrapper[4628]: I1211 05:27:01.237009 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-97bqw" event={"ID":"1c9bb5b6-577c-4b87-af2e-445ca30f9732","Type":"ContainerDied","Data":"e202d4a86d7d88134a51eccef4881ed2130f63da4fb4ab94c29fc15597aca521"} Dec 11 05:27:01 crc kubenswrapper[4628]: I1211 05:27:01.426704 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:27:01 crc kubenswrapper[4628]: I1211 05:27:01.426784 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:27:02 crc kubenswrapper[4628]: I1211 05:27:02.250334 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-97bqw" event={"ID":"1c9bb5b6-577c-4b87-af2e-445ca30f9732","Type":"ContainerStarted","Data":"e949dd97b07322741c6760f9b0f00ae225b4d0c5d737c30bb0298771f3fa2b84"} Dec 11 05:27:02 crc kubenswrapper[4628]: I1211 05:27:02.250403 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-97bqw" event={"ID":"1c9bb5b6-577c-4b87-af2e-445ca30f9732","Type":"ContainerStarted","Data":"4896078c9d6f5301fc74f70f0bb8e187d0e7d21b55b04fa0af48a4a9eab9008d"} Dec 11 05:27:02 crc kubenswrapper[4628]: I1211 05:27:02.250424 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-97bqw" event={"ID":"1c9bb5b6-577c-4b87-af2e-445ca30f9732","Type":"ContainerStarted","Data":"783fa92150dbfc23512bd2d268b5f2a7af5ee6ab1c3e291fb779feae521e012e"} Dec 11 05:27:02 crc kubenswrapper[4628]: I1211 05:27:02.250441 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-97bqw" event={"ID":"1c9bb5b6-577c-4b87-af2e-445ca30f9732","Type":"ContainerStarted","Data":"223b56220b7828e0a0114828d8aa35aa11f94e4e84b2a8684157a31572d907e8"} Dec 11 05:27:03 crc kubenswrapper[4628]: I1211 05:27:03.081352 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-v2km9" Dec 11 05:27:03 crc kubenswrapper[4628]: I1211 05:27:03.262785 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-97bqw" event={"ID":"1c9bb5b6-577c-4b87-af2e-445ca30f9732","Type":"ContainerStarted","Data":"54193173cb6351c4324a92cc45fb48ad93f342bd3a37b4598ac213df1d1b1880"} Dec 11 05:27:03 crc kubenswrapper[4628]: I1211 05:27:03.262840 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-97bqw" event={"ID":"1c9bb5b6-577c-4b87-af2e-445ca30f9732","Type":"ContainerStarted","Data":"77152bb0d0e3a58fbd63825c2fb757066ef5e3f9732e2ae690fc630b11e643cb"} Dec 11 05:27:03 crc kubenswrapper[4628]: I1211 05:27:03.264073 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-97bqw" Dec 11 05:27:03 crc kubenswrapper[4628]: I1211 05:27:03.292514 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-97bqw" podStartSLOduration=5.058373408 podStartE2EDuration="12.29248974s" podCreationTimestamp="2025-12-11 05:26:51 +0000 UTC" firstStartedPulling="2025-12-11 05:26:51.683367873 +0000 UTC m=+714.100714571" lastFinishedPulling="2025-12-11 05:26:58.917484195 +0000 UTC m=+721.334830903" observedRunningTime="2025-12-11 05:27:03.28886579 +0000 UTC m=+725.706212538" watchObservedRunningTime="2025-12-11 05:27:03.29248974 +0000 UTC m=+725.709836448" Dec 11 05:27:05 crc kubenswrapper[4628]: I1211 05:27:05.829097 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-pzm7q"] Dec 11 05:27:05 crc 
kubenswrapper[4628]: I1211 05:27:05.830069 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-pzm7q" Dec 11 05:27:05 crc kubenswrapper[4628]: I1211 05:27:05.833314 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-hbfbt" Dec 11 05:27:05 crc kubenswrapper[4628]: I1211 05:27:05.833341 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 11 05:27:05 crc kubenswrapper[4628]: I1211 05:27:05.835090 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 11 05:27:05 crc kubenswrapper[4628]: I1211 05:27:05.903281 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-pzm7q"] Dec 11 05:27:05 crc kubenswrapper[4628]: I1211 05:27:05.969356 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqgnn\" (UniqueName: \"kubernetes.io/projected/4ec22480-83a7-4532-9dc2-2e58db8be04f-kube-api-access-dqgnn\") pod \"openstack-operator-index-pzm7q\" (UID: \"4ec22480-83a7-4532-9dc2-2e58db8be04f\") " pod="openstack-operators/openstack-operator-index-pzm7q" Dec 11 05:27:06 crc kubenswrapper[4628]: I1211 05:27:06.071034 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqgnn\" (UniqueName: \"kubernetes.io/projected/4ec22480-83a7-4532-9dc2-2e58db8be04f-kube-api-access-dqgnn\") pod \"openstack-operator-index-pzm7q\" (UID: \"4ec22480-83a7-4532-9dc2-2e58db8be04f\") " pod="openstack-operators/openstack-operator-index-pzm7q" Dec 11 05:27:06 crc kubenswrapper[4628]: I1211 05:27:06.093212 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqgnn\" (UniqueName: \"kubernetes.io/projected/4ec22480-83a7-4532-9dc2-2e58db8be04f-kube-api-access-dqgnn\") pod \"openstack-operator-index-pzm7q\" (UID: \"4ec22480-83a7-4532-9dc2-2e58db8be04f\") " pod="openstack-operators/openstack-operator-index-pzm7q" Dec 11 05:27:06 crc kubenswrapper[4628]: I1211 05:27:06.149434 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-pzm7q" Dec 11 05:27:06 crc kubenswrapper[4628]: I1211 05:27:06.355867 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-pzm7q"] Dec 11 05:27:06 crc kubenswrapper[4628]: I1211 05:27:06.466864 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-97bqw" Dec 11 05:27:06 crc kubenswrapper[4628]: I1211 05:27:06.506627 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-97bqw" Dec 11 05:27:07 crc kubenswrapper[4628]: I1211 05:27:07.291803 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pzm7q" event={"ID":"4ec22480-83a7-4532-9dc2-2e58db8be04f","Type":"ContainerStarted","Data":"f448f3660fc0025a03310b77ebc7ab957a95b76bd0c25511b4987a644ef031ea"} Dec 11 05:27:09 crc kubenswrapper[4628]: I1211 05:27:09.196822 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-pzm7q"] Dec 11 05:27:09 crc kubenswrapper[4628]: I1211 05:27:09.804231 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zp8gf"] Dec 11 05:27:09 crc kubenswrapper[4628]: I1211 05:27:09.805419 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zp8gf" Dec 11 05:27:09 crc kubenswrapper[4628]: I1211 05:27:09.810161 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zp8gf"] Dec 11 05:27:09 crc kubenswrapper[4628]: I1211 05:27:09.929714 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcmmz\" (UniqueName: \"kubernetes.io/projected/589fc89a-de3e-4916-81a4-5972e3bd2410-kube-api-access-zcmmz\") pod \"openstack-operator-index-zp8gf\" (UID: \"589fc89a-de3e-4916-81a4-5972e3bd2410\") " pod="openstack-operators/openstack-operator-index-zp8gf" Dec 11 05:27:10 crc kubenswrapper[4628]: I1211 05:27:10.031220 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcmmz\" (UniqueName: \"kubernetes.io/projected/589fc89a-de3e-4916-81a4-5972e3bd2410-kube-api-access-zcmmz\") pod \"openstack-operator-index-zp8gf\" (UID: \"589fc89a-de3e-4916-81a4-5972e3bd2410\") " pod="openstack-operators/openstack-operator-index-zp8gf" Dec 11 05:27:10 crc kubenswrapper[4628]: I1211 05:27:10.056816 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcmmz\" (UniqueName: \"kubernetes.io/projected/589fc89a-de3e-4916-81a4-5972e3bd2410-kube-api-access-zcmmz\") pod \"openstack-operator-index-zp8gf\" (UID: \"589fc89a-de3e-4916-81a4-5972e3bd2410\") " pod="openstack-operators/openstack-operator-index-zp8gf" Dec 11 05:27:10 crc kubenswrapper[4628]: I1211 05:27:10.153250 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-zp8gf" Dec 11 05:27:10 crc kubenswrapper[4628]: I1211 05:27:10.320808 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pzm7q" event={"ID":"4ec22480-83a7-4532-9dc2-2e58db8be04f","Type":"ContainerStarted","Data":"d47dc4ff2d3ac837743903b97065889f18d34ec39a61b4fad6074c1328f8886a"} Dec 11 05:27:10 crc kubenswrapper[4628]: I1211 05:27:10.320970 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-pzm7q" podUID="4ec22480-83a7-4532-9dc2-2e58db8be04f" containerName="registry-server" containerID="cri-o://d47dc4ff2d3ac837743903b97065889f18d34ec39a61b4fad6074c1328f8886a" gracePeriod=2 Dec 11 05:27:10 crc kubenswrapper[4628]: I1211 05:27:10.342187 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-pzm7q" podStartSLOduration=1.9382008320000002 podStartE2EDuration="5.342167368s" podCreationTimestamp="2025-12-11 05:27:05 +0000 UTC" firstStartedPulling="2025-12-11 05:27:06.372512788 +0000 UTC m=+728.789859486" lastFinishedPulling="2025-12-11 05:27:09.776479314 +0000 UTC m=+732.193826022" observedRunningTime="2025-12-11 05:27:10.337452168 +0000 UTC m=+732.754798886" watchObservedRunningTime="2025-12-11 05:27:10.342167368 +0000 UTC m=+732.759514076" Dec 11 05:27:10 crc kubenswrapper[4628]: I1211 05:27:10.449911 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zp8gf"] Dec 11 05:27:10 crc kubenswrapper[4628]: W1211 05:27:10.453758 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod589fc89a_de3e_4916_81a4_5972e3bd2410.slice/crio-68845cad966454f6b75c0811d7323f03b029ee611f8c57b00ee364c8dd14fe40 WatchSource:0}: Error finding container 68845cad966454f6b75c0811d7323f03b029ee611f8c57b00ee364c8dd14fe40: Status 404 returned error can't find the container with id 68845cad966454f6b75c0811d7323f03b029ee611f8c57b00ee364c8dd14fe40 Dec 11 05:27:10 crc kubenswrapper[4628]: I1211 05:27:10.635221 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-pzm7q" Dec 11 05:27:10 crc kubenswrapper[4628]: I1211 05:27:10.746736 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqgnn\" (UniqueName: \"kubernetes.io/projected/4ec22480-83a7-4532-9dc2-2e58db8be04f-kube-api-access-dqgnn\") pod \"4ec22480-83a7-4532-9dc2-2e58db8be04f\" (UID: \"4ec22480-83a7-4532-9dc2-2e58db8be04f\") " Dec 11 05:27:10 crc kubenswrapper[4628]: I1211 05:27:10.753932 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ec22480-83a7-4532-9dc2-2e58db8be04f-kube-api-access-dqgnn" (OuterVolumeSpecName: "kube-api-access-dqgnn") pod "4ec22480-83a7-4532-9dc2-2e58db8be04f" (UID: "4ec22480-83a7-4532-9dc2-2e58db8be04f"). InnerVolumeSpecName "kube-api-access-dqgnn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:27:10 crc kubenswrapper[4628]: I1211 05:27:10.849925 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqgnn\" (UniqueName: \"kubernetes.io/projected/4ec22480-83a7-4532-9dc2-2e58db8be04f-kube-api-access-dqgnn\") on node \"crc\" DevicePath \"\"" Dec 11 05:27:11 crc kubenswrapper[4628]: I1211 05:27:11.329558 4628 generic.go:334] "Generic (PLEG): container finished" podID="4ec22480-83a7-4532-9dc2-2e58db8be04f" containerID="d47dc4ff2d3ac837743903b97065889f18d34ec39a61b4fad6074c1328f8886a" exitCode=0 Dec 11 05:27:11 crc kubenswrapper[4628]: I1211 05:27:11.329651 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pzm7q" event={"ID":"4ec22480-83a7-4532-9dc2-2e58db8be04f","Type":"ContainerDied","Data":"d47dc4ff2d3ac837743903b97065889f18d34ec39a61b4fad6074c1328f8886a"} Dec 11 05:27:11 crc kubenswrapper[4628]: I1211 05:27:11.330119 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pzm7q" event={"ID":"4ec22480-83a7-4532-9dc2-2e58db8be04f","Type":"ContainerDied","Data":"f448f3660fc0025a03310b77ebc7ab957a95b76bd0c25511b4987a644ef031ea"} Dec 11 05:27:11 crc kubenswrapper[4628]: I1211 05:27:11.330146 4628 scope.go:117] "RemoveContainer" containerID="d47dc4ff2d3ac837743903b97065889f18d34ec39a61b4fad6074c1328f8886a" Dec 11 05:27:11 crc kubenswrapper[4628]: I1211 05:27:11.329673 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-pzm7q" Dec 11 05:27:11 crc kubenswrapper[4628]: I1211 05:27:11.332250 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zp8gf" event={"ID":"589fc89a-de3e-4916-81a4-5972e3bd2410","Type":"ContainerStarted","Data":"9076c91111ee2a7dd1c06a5ba4afd31db6bf56e98c213169adec36933dc92ec6"} Dec 11 05:27:11 crc kubenswrapper[4628]: I1211 05:27:11.332269 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zp8gf" event={"ID":"589fc89a-de3e-4916-81a4-5972e3bd2410","Type":"ContainerStarted","Data":"68845cad966454f6b75c0811d7323f03b029ee611f8c57b00ee364c8dd14fe40"} Dec 11 05:27:11 crc kubenswrapper[4628]: I1211 05:27:11.360445 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zp8gf" podStartSLOduration=2.313061434 podStartE2EDuration="2.360419336s" podCreationTimestamp="2025-12-11 05:27:09 +0000 UTC" firstStartedPulling="2025-12-11 05:27:10.461239912 +0000 UTC m=+732.878586610" lastFinishedPulling="2025-12-11 05:27:10.508597804 +0000 UTC m=+732.925944512" observedRunningTime="2025-12-11 05:27:11.357702181 +0000 UTC m=+733.775048909" watchObservedRunningTime="2025-12-11 05:27:11.360419336 +0000 UTC m=+733.777766104" Dec 11 05:27:11 crc kubenswrapper[4628]: I1211 05:27:11.362066 4628 scope.go:117] "RemoveContainer" containerID="d47dc4ff2d3ac837743903b97065889f18d34ec39a61b4fad6074c1328f8886a" Dec 11 05:27:11 crc kubenswrapper[4628]: E1211 05:27:11.362711 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d47dc4ff2d3ac837743903b97065889f18d34ec39a61b4fad6074c1328f8886a\": container with ID starting with d47dc4ff2d3ac837743903b97065889f18d34ec39a61b4fad6074c1328f8886a not found: ID does not exist" containerID="d47dc4ff2d3ac837743903b97065889f18d34ec39a61b4fad6074c1328f8886a" 
Dec 11 05:27:11 crc kubenswrapper[4628]: I1211 05:27:11.362763 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d47dc4ff2d3ac837743903b97065889f18d34ec39a61b4fad6074c1328f8886a"} err="failed to get container status \"d47dc4ff2d3ac837743903b97065889f18d34ec39a61b4fad6074c1328f8886a\": rpc error: code = NotFound desc = could not find container \"d47dc4ff2d3ac837743903b97065889f18d34ec39a61b4fad6074c1328f8886a\": container with ID starting with d47dc4ff2d3ac837743903b97065889f18d34ec39a61b4fad6074c1328f8886a not found: ID does not exist" Dec 11 05:27:11 crc kubenswrapper[4628]: I1211 05:27:11.384233 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-pzm7q"] Dec 11 05:27:11 crc kubenswrapper[4628]: I1211 05:27:11.393319 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-pzm7q"] Dec 11 05:27:11 crc kubenswrapper[4628]: I1211 05:27:11.470264 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-97bqw" Dec 11 05:27:11 crc kubenswrapper[4628]: I1211 05:27:11.481406 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-5n5bs" Dec 11 05:27:11 crc kubenswrapper[4628]: I1211 05:27:11.615583 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-q4v8q" Dec 11 05:27:11 crc kubenswrapper[4628]: I1211 05:27:11.915539 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ec22480-83a7-4532-9dc2-2e58db8be04f" path="/var/lib/kubelet/pods/4ec22480-83a7-4532-9dc2-2e58db8be04f/volumes" Dec 11 05:27:20 crc kubenswrapper[4628]: I1211 05:27:20.154068 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-zp8gf" Dec 11 05:27:20 crc kubenswrapper[4628]: I1211 05:27:20.154812 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-zp8gf" Dec 11 05:27:20 crc kubenswrapper[4628]: I1211 05:27:20.204272 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-zp8gf" Dec 11 05:27:20 crc kubenswrapper[4628]: I1211 05:27:20.451772 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-zp8gf" Dec 11 05:27:27 crc kubenswrapper[4628]: I1211 05:27:27.456783 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7"] Dec 11 05:27:27 crc kubenswrapper[4628]: E1211 05:27:27.457954 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ec22480-83a7-4532-9dc2-2e58db8be04f" containerName="registry-server" Dec 11 05:27:27 crc kubenswrapper[4628]: I1211 05:27:27.457973 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec22480-83a7-4532-9dc2-2e58db8be04f" containerName="registry-server" Dec 11 05:27:27 crc kubenswrapper[4628]: I1211 05:27:27.458189 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ec22480-83a7-4532-9dc2-2e58db8be04f" containerName="registry-server" Dec 11 05:27:27 crc kubenswrapper[4628]: I1211 05:27:27.459480 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7" Dec 11 05:27:27 crc kubenswrapper[4628]: I1211 05:27:27.463133 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-h5bv9" Dec 11 05:27:27 crc kubenswrapper[4628]: I1211 05:27:27.478205 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7"] Dec 11 05:27:27 crc kubenswrapper[4628]: I1211 05:27:27.617932 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be5c0815-ff74-4d42-b5fa-5c3291e5f71d-util\") pod \"5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7\" (UID: \"be5c0815-ff74-4d42-b5fa-5c3291e5f71d\") " pod="openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7" Dec 11 05:27:27 crc kubenswrapper[4628]: I1211 05:27:27.617988 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzwgc\" (UniqueName: \"kubernetes.io/projected/be5c0815-ff74-4d42-b5fa-5c3291e5f71d-kube-api-access-lzwgc\") pod \"5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7\" (UID: \"be5c0815-ff74-4d42-b5fa-5c3291e5f71d\") " pod="openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7" Dec 11 05:27:27 crc kubenswrapper[4628]: I1211 05:27:27.618019 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be5c0815-ff74-4d42-b5fa-5c3291e5f71d-bundle\") pod \"5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7\" (UID: \"be5c0815-ff74-4d42-b5fa-5c3291e5f71d\") " pod="openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7" Dec 11 05:27:27 crc kubenswrapper[4628]: I1211 05:27:27.718885 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be5c0815-ff74-4d42-b5fa-5c3291e5f71d-util\") pod \"5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7\" (UID: \"be5c0815-ff74-4d42-b5fa-5c3291e5f71d\") " pod="openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7" Dec 11 05:27:27 crc kubenswrapper[4628]: I1211 05:27:27.718944 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzwgc\" (UniqueName: \"kubernetes.io/projected/be5c0815-ff74-4d42-b5fa-5c3291e5f71d-kube-api-access-lzwgc\") pod \"5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7\" (UID: \"be5c0815-ff74-4d42-b5fa-5c3291e5f71d\") " pod="openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7" Dec 11 05:27:27 crc kubenswrapper[4628]: I1211 05:27:27.718989 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be5c0815-ff74-4d42-b5fa-5c3291e5f71d-bundle\") pod \"5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7\" (UID: \"be5c0815-ff74-4d42-b5fa-5c3291e5f71d\") " pod="openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7" Dec 11 05:27:27 crc kubenswrapper[4628]: I1211 05:27:27.719358 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/be5c0815-ff74-4d42-b5fa-5c3291e5f71d-util\") pod \"5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7\" (UID: \"be5c0815-ff74-4d42-b5fa-5c3291e5f71d\") " pod="openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7" Dec 11 05:27:27 crc kubenswrapper[4628]: I1211 05:27:27.719404 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be5c0815-ff74-4d42-b5fa-5c3291e5f71d-bundle\") pod \"5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7\" (UID: \"be5c0815-ff74-4d42-b5fa-5c3291e5f71d\") " pod="openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7" Dec 11 05:27:27 crc kubenswrapper[4628]: I1211 05:27:27.739442 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzwgc\" (UniqueName: \"kubernetes.io/projected/be5c0815-ff74-4d42-b5fa-5c3291e5f71d-kube-api-access-lzwgc\") pod \"5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7\" (UID: \"be5c0815-ff74-4d42-b5fa-5c3291e5f71d\") " pod="openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7" Dec 11 05:27:27 crc kubenswrapper[4628]: I1211 05:27:27.779709 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7" Dec 11 05:27:28 crc kubenswrapper[4628]: I1211 05:27:28.239601 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7"] Dec 11 05:27:28 crc kubenswrapper[4628]: W1211 05:27:28.247095 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe5c0815_ff74_4d42_b5fa_5c3291e5f71d.slice/crio-4c6c1e5dd8f51576a3c71bec9ac81b2a345c0cd2349ac1bc30176a7e06879f6b WatchSource:0}: Error finding container 4c6c1e5dd8f51576a3c71bec9ac81b2a345c0cd2349ac1bc30176a7e06879f6b: Status 404 returned error can't find the container with id 4c6c1e5dd8f51576a3c71bec9ac81b2a345c0cd2349ac1bc30176a7e06879f6b Dec 11 05:27:28 crc kubenswrapper[4628]: I1211 05:27:28.457676 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7" event={"ID":"be5c0815-ff74-4d42-b5fa-5c3291e5f71d","Type":"ContainerStarted","Data":"4c6c1e5dd8f51576a3c71bec9ac81b2a345c0cd2349ac1bc30176a7e06879f6b"} Dec 11 05:27:28 crc kubenswrapper[4628]: E1211 05:27:28.880202 4628 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe5c0815_ff74_4d42_b5fa_5c3291e5f71d.slice/crio-conmon-394fc0c6925b03e194172eabe6fa2435d5df838aa693257561fafcaf6d2dfc6c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe5c0815_ff74_4d42_b5fa_5c3291e5f71d.slice/crio-394fc0c6925b03e194172eabe6fa2435d5df838aa693257561fafcaf6d2dfc6c.scope\": RecentStats: unable to find data in memory cache]" Dec 11 05:27:29 crc kubenswrapper[4628]: I1211 05:27:29.467716 4628 generic.go:334] "Generic (PLEG): container finished" podID="be5c0815-ff74-4d42-b5fa-5c3291e5f71d" containerID="394fc0c6925b03e194172eabe6fa2435d5df838aa693257561fafcaf6d2dfc6c" exitCode=0 Dec 11 05:27:29 crc kubenswrapper[4628]: I1211 05:27:29.467805 4628 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7" event={"ID":"be5c0815-ff74-4d42-b5fa-5c3291e5f71d","Type":"ContainerDied","Data":"394fc0c6925b03e194172eabe6fa2435d5df838aa693257561fafcaf6d2dfc6c"} Dec 11 05:27:30 crc kubenswrapper[4628]: I1211 05:27:30.479229 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7" event={"ID":"be5c0815-ff74-4d42-b5fa-5c3291e5f71d","Type":"ContainerDied","Data":"2edb97b599afe62b071d8f0a9ff4448e7dbb4e82daaa13b1252f395ee8814d28"} Dec 11 05:27:30 crc kubenswrapper[4628]: I1211 05:27:30.479148 4628 generic.go:334] "Generic (PLEG): container finished" podID="be5c0815-ff74-4d42-b5fa-5c3291e5f71d" containerID="2edb97b599afe62b071d8f0a9ff4448e7dbb4e82daaa13b1252f395ee8814d28" exitCode=0 Dec 11 05:27:31 crc kubenswrapper[4628]: I1211 05:27:31.426791 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:27:31 crc kubenswrapper[4628]: I1211 05:27:31.427317 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:27:31 crc kubenswrapper[4628]: I1211 05:27:31.491980 4628 generic.go:334] "Generic (PLEG): container finished" podID="be5c0815-ff74-4d42-b5fa-5c3291e5f71d" containerID="d1a2ca4d826b1143838d7ca1ff29d2eb056125fcc6ba275330d2082e0dc054cf" exitCode=0 Dec 11 05:27:31 crc kubenswrapper[4628]: I1211 05:27:31.492060 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7" event={"ID":"be5c0815-ff74-4d42-b5fa-5c3291e5f71d","Type":"ContainerDied","Data":"d1a2ca4d826b1143838d7ca1ff29d2eb056125fcc6ba275330d2082e0dc054cf"} Dec 11 05:27:32 crc kubenswrapper[4628]: I1211 05:27:32.791478 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7" Dec 11 05:27:32 crc kubenswrapper[4628]: I1211 05:27:32.896948 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzwgc\" (UniqueName: \"kubernetes.io/projected/be5c0815-ff74-4d42-b5fa-5c3291e5f71d-kube-api-access-lzwgc\") pod \"be5c0815-ff74-4d42-b5fa-5c3291e5f71d\" (UID: \"be5c0815-ff74-4d42-b5fa-5c3291e5f71d\") " Dec 11 05:27:32 crc kubenswrapper[4628]: I1211 05:27:32.897114 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be5c0815-ff74-4d42-b5fa-5c3291e5f71d-util\") pod \"be5c0815-ff74-4d42-b5fa-5c3291e5f71d\" (UID: \"be5c0815-ff74-4d42-b5fa-5c3291e5f71d\") " Dec 11 05:27:32 crc kubenswrapper[4628]: I1211 05:27:32.903157 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be5c0815-ff74-4d42-b5fa-5c3291e5f71d-bundle\") pod \"be5c0815-ff74-4d42-b5fa-5c3291e5f71d\" (UID: \"be5c0815-ff74-4d42-b5fa-5c3291e5f71d\") " Dec 11 05:27:32 crc kubenswrapper[4628]: I1211 05:27:32.904004 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be5c0815-ff74-4d42-b5fa-5c3291e5f71d-bundle" (OuterVolumeSpecName: "bundle") pod "be5c0815-ff74-4d42-b5fa-5c3291e5f71d" (UID: "be5c0815-ff74-4d42-b5fa-5c3291e5f71d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:27:32 crc kubenswrapper[4628]: I1211 05:27:32.904040 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be5c0815-ff74-4d42-b5fa-5c3291e5f71d-kube-api-access-lzwgc" (OuterVolumeSpecName: "kube-api-access-lzwgc") pod "be5c0815-ff74-4d42-b5fa-5c3291e5f71d" (UID: "be5c0815-ff74-4d42-b5fa-5c3291e5f71d"). InnerVolumeSpecName "kube-api-access-lzwgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:27:32 crc kubenswrapper[4628]: I1211 05:27:32.912767 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be5c0815-ff74-4d42-b5fa-5c3291e5f71d-util" (OuterVolumeSpecName: "util") pod "be5c0815-ff74-4d42-b5fa-5c3291e5f71d" (UID: "be5c0815-ff74-4d42-b5fa-5c3291e5f71d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:27:33 crc kubenswrapper[4628]: I1211 05:27:33.005688 4628 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be5c0815-ff74-4d42-b5fa-5c3291e5f71d-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:27:33 crc kubenswrapper[4628]: I1211 05:27:33.005729 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzwgc\" (UniqueName: \"kubernetes.io/projected/be5c0815-ff74-4d42-b5fa-5c3291e5f71d-kube-api-access-lzwgc\") on node \"crc\" DevicePath \"\"" Dec 11 05:27:33 crc kubenswrapper[4628]: I1211 05:27:33.005742 4628 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be5c0815-ff74-4d42-b5fa-5c3291e5f71d-util\") on node \"crc\" DevicePath \"\"" Dec 11 05:27:33 crc kubenswrapper[4628]: I1211 05:27:33.510124 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7" event={"ID":"be5c0815-ff74-4d42-b5fa-5c3291e5f71d","Type":"ContainerDied","Data":"4c6c1e5dd8f51576a3c71bec9ac81b2a345c0cd2349ac1bc30176a7e06879f6b"} Dec 11 05:27:33 crc kubenswrapper[4628]: I1211 05:27:33.510602 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c6c1e5dd8f51576a3c71bec9ac81b2a345c0cd2349ac1bc30176a7e06879f6b" Dec 11 05:27:33 crc kubenswrapper[4628]: I1211 05:27:33.510149 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7" Dec 11 05:27:40 crc kubenswrapper[4628]: I1211 05:27:40.269408 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7fb58bb479-g2k9b"] Dec 11 05:27:40 crc kubenswrapper[4628]: E1211 05:27:40.270233 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5c0815-ff74-4d42-b5fa-5c3291e5f71d" containerName="pull" Dec 11 05:27:40 crc kubenswrapper[4628]: I1211 05:27:40.270247 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5c0815-ff74-4d42-b5fa-5c3291e5f71d" containerName="pull" Dec 11 05:27:40 crc kubenswrapper[4628]: E1211 05:27:40.270260 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5c0815-ff74-4d42-b5fa-5c3291e5f71d" containerName="util" Dec 11 05:27:40 crc kubenswrapper[4628]: I1211 05:27:40.270268 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5c0815-ff74-4d42-b5fa-5c3291e5f71d" containerName="util" Dec 11 05:27:40 crc kubenswrapper[4628]: E1211 05:27:40.270279 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5c0815-ff74-4d42-b5fa-5c3291e5f71d" containerName="extract" Dec 11 05:27:40 crc kubenswrapper[4628]: I1211 05:27:40.270286 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5c0815-ff74-4d42-b5fa-5c3291e5f71d" containerName="extract" Dec 11 05:27:40 crc kubenswrapper[4628]: I1211 05:27:40.270418 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="be5c0815-ff74-4d42-b5fa-5c3291e5f71d" containerName="extract" Dec 11 05:27:40 crc kubenswrapper[4628]: I1211 05:27:40.270902 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7fb58bb479-g2k9b" Dec 11 05:27:40 crc kubenswrapper[4628]: I1211 05:27:40.272757 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-nvqxw" Dec 11 05:27:40 crc kubenswrapper[4628]: I1211 05:27:40.285163 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7fb58bb479-g2k9b"] Dec 11 05:27:40 crc kubenswrapper[4628]: I1211 05:27:40.410483 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnwnr\" (UniqueName: \"kubernetes.io/projected/91ff2419-7fdf-4656-8d3a-69295ad50387-kube-api-access-tnwnr\") pod \"openstack-operator-controller-operator-7fb58bb479-g2k9b\" (UID: \"91ff2419-7fdf-4656-8d3a-69295ad50387\") " pod="openstack-operators/openstack-operator-controller-operator-7fb58bb479-g2k9b" Dec 11 05:27:40 crc kubenswrapper[4628]: I1211 05:27:40.511401 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnwnr\" (UniqueName: \"kubernetes.io/projected/91ff2419-7fdf-4656-8d3a-69295ad50387-kube-api-access-tnwnr\") pod \"openstack-operator-controller-operator-7fb58bb479-g2k9b\" (UID: \"91ff2419-7fdf-4656-8d3a-69295ad50387\") " pod="openstack-operators/openstack-operator-controller-operator-7fb58bb479-g2k9b" Dec 11 05:27:40 crc kubenswrapper[4628]: I1211 05:27:40.533116 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnwnr\" (UniqueName: \"kubernetes.io/projected/91ff2419-7fdf-4656-8d3a-69295ad50387-kube-api-access-tnwnr\") pod \"openstack-operator-controller-operator-7fb58bb479-g2k9b\" (UID: \"91ff2419-7fdf-4656-8d3a-69295ad50387\") " pod="openstack-operators/openstack-operator-controller-operator-7fb58bb479-g2k9b" Dec 11 05:27:40 crc kubenswrapper[4628]: I1211 05:27:40.589330 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7fb58bb479-g2k9b" Dec 11 05:27:40 crc kubenswrapper[4628]: I1211 05:27:40.867740 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7fb58bb479-g2k9b"] Dec 11 05:27:41 crc kubenswrapper[4628]: I1211 05:27:41.564037 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7fb58bb479-g2k9b" event={"ID":"91ff2419-7fdf-4656-8d3a-69295ad50387","Type":"ContainerStarted","Data":"27fca0878b3b44799a5b9acfab522442a1310b40413eeb8795ee52056443f2db"} Dec 11 05:27:47 crc kubenswrapper[4628]: I1211 05:27:47.609794 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7fb58bb479-g2k9b" event={"ID":"91ff2419-7fdf-4656-8d3a-69295ad50387","Type":"ContainerStarted","Data":"4925a5aec92dc6b1574da1b9e0f2b03bdeae0c80f2524b32d74742811527d176"} Dec 11 05:27:47 crc kubenswrapper[4628]: I1211 05:27:47.611199 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7fb58bb479-g2k9b" Dec 11 05:27:47 crc kubenswrapper[4628]: I1211 05:27:47.642002 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7fb58bb479-g2k9b" podStartSLOduration=1.551316818 podStartE2EDuration="7.641980258s" podCreationTimestamp="2025-12-11 05:27:40 +0000 UTC" firstStartedPulling="2025-12-11 05:27:40.884931456 +0000 UTC m=+763.302278174" lastFinishedPulling="2025-12-11 05:27:46.975594916 +0000 UTC m=+769.392941614" observedRunningTime="2025-12-11 05:27:47.639447339 +0000 UTC m=+770.056794047" watchObservedRunningTime="2025-12-11 05:27:47.641980258 +0000 UTC m=+770.059326956" Dec 11 05:28:00 crc kubenswrapper[4628]: I1211 05:28:00.592377 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7fb58bb479-g2k9b" Dec 11 05:28:01 crc kubenswrapper[4628]: I1211 05:28:01.426262 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:28:01 crc kubenswrapper[4628]: I1211 05:28:01.426319 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:28:01 crc kubenswrapper[4628]: I1211 05:28:01.426358 4628 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:28:01 crc kubenswrapper[4628]: I1211 05:28:01.426751 4628 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2edbf0424a7d52e635507a6262c52a38d0cf51657fa8d3615985b25f98b6c93c"} pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 05:28:01 crc kubenswrapper[4628]: I1211 05:28:01.426807 
4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" containerID="cri-o://2edbf0424a7d52e635507a6262c52a38d0cf51657fa8d3615985b25f98b6c93c" gracePeriod=600 Dec 11 05:28:01 crc kubenswrapper[4628]: I1211 05:28:01.705719 4628 generic.go:334] "Generic (PLEG): container finished" podID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerID="2edbf0424a7d52e635507a6262c52a38d0cf51657fa8d3615985b25f98b6c93c" exitCode=0 Dec 11 05:28:01 crc kubenswrapper[4628]: I1211 05:28:01.705819 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerDied","Data":"2edbf0424a7d52e635507a6262c52a38d0cf51657fa8d3615985b25f98b6c93c"} Dec 11 05:28:01 crc kubenswrapper[4628]: I1211 05:28:01.706134 4628 scope.go:117] "RemoveContainer" containerID="96488dad0283d5c27c0403cf2393677a28a1af0afc44fbcf6fbe3d10bd0060af" Dec 11 05:28:02 crc kubenswrapper[4628]: I1211 05:28:02.718170 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"4024bd10762b90d0b487ed903bd8b69e2ebeac5fe50ac7d4b3037fdf7a40c2b1"} Dec 11 05:28:23 crc kubenswrapper[4628]: I1211 05:28:23.612495 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k4pfr"] Dec 11 05:28:23 crc kubenswrapper[4628]: I1211 05:28:23.614317 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k4pfr" Dec 11 05:28:23 crc kubenswrapper[4628]: I1211 05:28:23.639922 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k4pfr"] Dec 11 05:28:23 crc kubenswrapper[4628]: I1211 05:28:23.738221 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf3bfbad-c97e-47e9-9390-233f50d34f49-utilities\") pod \"certified-operators-k4pfr\" (UID: \"cf3bfbad-c97e-47e9-9390-233f50d34f49\") " pod="openshift-marketplace/certified-operators-k4pfr" Dec 11 05:28:23 crc kubenswrapper[4628]: I1211 05:28:23.738272 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf3bfbad-c97e-47e9-9390-233f50d34f49-catalog-content\") pod \"certified-operators-k4pfr\" (UID: \"cf3bfbad-c97e-47e9-9390-233f50d34f49\") " pod="openshift-marketplace/certified-operators-k4pfr" Dec 11 05:28:23 crc kubenswrapper[4628]: I1211 05:28:23.738334 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxwzx\" (UniqueName: \"kubernetes.io/projected/cf3bfbad-c97e-47e9-9390-233f50d34f49-kube-api-access-sxwzx\") pod \"certified-operators-k4pfr\" (UID: \"cf3bfbad-c97e-47e9-9390-233f50d34f49\") " pod="openshift-marketplace/certified-operators-k4pfr" Dec 11 05:28:23 crc kubenswrapper[4628]: I1211 05:28:23.839562 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf3bfbad-c97e-47e9-9390-233f50d34f49-utilities\") pod \"certified-operators-k4pfr\" (UID: \"cf3bfbad-c97e-47e9-9390-233f50d34f49\") " 
pod="openshift-marketplace/certified-operators-k4pfr" Dec 11 05:28:23 crc kubenswrapper[4628]: I1211 05:28:23.839623 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf3bfbad-c97e-47e9-9390-233f50d34f49-catalog-content\") pod \"certified-operators-k4pfr\" (UID: \"cf3bfbad-c97e-47e9-9390-233f50d34f49\") " pod="openshift-marketplace/certified-operators-k4pfr" Dec 11 05:28:23 crc kubenswrapper[4628]: I1211 05:28:23.839658 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxwzx\" (UniqueName: \"kubernetes.io/projected/cf3bfbad-c97e-47e9-9390-233f50d34f49-kube-api-access-sxwzx\") pod \"certified-operators-k4pfr\" (UID: \"cf3bfbad-c97e-47e9-9390-233f50d34f49\") " pod="openshift-marketplace/certified-operators-k4pfr" Dec 11 05:28:23 crc kubenswrapper[4628]: I1211 05:28:23.840150 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf3bfbad-c97e-47e9-9390-233f50d34f49-utilities\") pod \"certified-operators-k4pfr\" (UID: \"cf3bfbad-c97e-47e9-9390-233f50d34f49\") " pod="openshift-marketplace/certified-operators-k4pfr" Dec 11 05:28:23 crc kubenswrapper[4628]: I1211 05:28:23.840268 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf3bfbad-c97e-47e9-9390-233f50d34f49-catalog-content\") pod \"certified-operators-k4pfr\" (UID: \"cf3bfbad-c97e-47e9-9390-233f50d34f49\") " pod="openshift-marketplace/certified-operators-k4pfr" Dec 11 05:28:23 crc kubenswrapper[4628]: I1211 05:28:23.884699 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxwzx\" (UniqueName: \"kubernetes.io/projected/cf3bfbad-c97e-47e9-9390-233f50d34f49-kube-api-access-sxwzx\") pod \"certified-operators-k4pfr\" (UID: \"cf3bfbad-c97e-47e9-9390-233f50d34f49\") " pod="openshift-marketplace/certified-operators-k4pfr" Dec 11 05:28:23 crc kubenswrapper[4628]: I1211 05:28:23.929146 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4pfr" Dec 11 05:28:24 crc kubenswrapper[4628]: I1211 05:28:24.169654 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k4pfr"] Dec 11 05:28:24 crc kubenswrapper[4628]: I1211 05:28:24.851330 4628 generic.go:334] "Generic (PLEG): container finished" podID="cf3bfbad-c97e-47e9-9390-233f50d34f49" containerID="b7900c1af0c490534b2e4a8a1de0bad9c8eb5a500484d02b782e03d0d703c559" exitCode=0 Dec 11 05:28:24 crc kubenswrapper[4628]: I1211 05:28:24.851606 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4pfr" event={"ID":"cf3bfbad-c97e-47e9-9390-233f50d34f49","Type":"ContainerDied","Data":"b7900c1af0c490534b2e4a8a1de0bad9c8eb5a500484d02b782e03d0d703c559"} Dec 11 05:28:24 crc kubenswrapper[4628]: I1211 05:28:24.851632 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4pfr" event={"ID":"cf3bfbad-c97e-47e9-9390-233f50d34f49","Type":"ContainerStarted","Data":"71a1d28479e6fbe46d7aa6ad67ea5a5b40b9173e3985cb693f0e1f5892bd2f0d"} Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.325065 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-9d9wj"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.325893 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9d9wj" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.329473 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-7xgkg" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.335126 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-h5xhk"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.336744 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-h5xhk" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.338540 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5729v" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.350688 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-9d9wj"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.363048 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-h5xhk"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.447909 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-whlx7"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.449690 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-whlx7" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.454167 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-z2bcp" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.456347 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jckn\" (UniqueName: \"kubernetes.io/projected/f7d58419-0988-4a35-800f-2298db8e6597-kube-api-access-9jckn\") pod \"barbican-operator-controller-manager-7d9dfd778-9d9wj\" (UID: \"f7d58419-0988-4a35-800f-2298db8e6597\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9d9wj" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.456416 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq6gb\" (UniqueName: \"kubernetes.io/projected/43de67af-1cf5-4412-833e-e95e2ffcc47b-kube-api-access-bq6gb\") pod \"cinder-operator-controller-manager-6c677c69b-h5xhk\" (UID: \"43de67af-1cf5-4412-833e-e95e2ffcc47b\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-h5xhk" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.458670 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-v2lxt"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.472106 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-v2lxt" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.474006 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-xzftd" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.491201 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9bcfl"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.492308 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9bcfl" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.496336 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-7g8qk" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.497714 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-whlx7"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.512877 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-v2lxt"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.521869 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.522941 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.528257 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-ts7tq" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.537909 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9bcfl"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.552899 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.553930 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.559152 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.559485 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-7w8cl" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.562438 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jckn\" (UniqueName: \"kubernetes.io/projected/f7d58419-0988-4a35-800f-2298db8e6597-kube-api-access-9jckn\") pod \"barbican-operator-controller-manager-7d9dfd778-9d9wj\" (UID: \"f7d58419-0988-4a35-800f-2298db8e6597\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9d9wj" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.562511 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rxkm\" (UniqueName: \"kubernetes.io/projected/2f46589d-ec5b-48e9-8f64-741a6a5b3e84-kube-api-access-7rxkm\") pod \"designate-operator-controller-manager-697fb699cf-whlx7\" (UID: \"2f46589d-ec5b-48e9-8f64-741a6a5b3e84\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-whlx7" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.562541 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq6gb\" (UniqueName: \"kubernetes.io/projected/43de67af-1cf5-4412-833e-e95e2ffcc47b-kube-api-access-bq6gb\") pod \"cinder-operator-controller-manager-6c677c69b-h5xhk\" (UID: \"43de67af-1cf5-4412-833e-e95e2ffcc47b\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-h5xhk" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.576689 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-x4p8r"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.577748 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-x4p8r" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.584377 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-zp624" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.613936 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jckn\" (UniqueName: \"kubernetes.io/projected/f7d58419-0988-4a35-800f-2298db8e6597-kube-api-access-9jckn\") pod \"barbican-operator-controller-manager-7d9dfd778-9d9wj\" (UID: \"f7d58419-0988-4a35-800f-2298db8e6597\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9d9wj" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.615456 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq6gb\" (UniqueName: \"kubernetes.io/projected/43de67af-1cf5-4412-833e-e95e2ffcc47b-kube-api-access-bq6gb\") pod \"cinder-operator-controller-manager-6c677c69b-h5xhk\" (UID: \"43de67af-1cf5-4412-833e-e95e2ffcc47b\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-h5xhk" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.618894 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.639101 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-bqcdn"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.640545 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-bqcdn" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.642913 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.646144 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-2mpvr" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.652173 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9d9wj" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.653901 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-bqcdn"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.662770 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jqz2j"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.663938 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jqz2j" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.663951 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-h5xhk" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.664501 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2525c\" (UniqueName: \"kubernetes.io/projected/f041b1fa-37ae-46fc-b6b0-301da06c1ff7-kube-api-access-2525c\") pod \"heat-operator-controller-manager-5f64f6f8bb-9bcfl\" (UID: \"f041b1fa-37ae-46fc-b6b0-301da06c1ff7\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9bcfl" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.664622 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae8e31fb-df50-4c43-af56-9c01af34f181-cert\") pod \"infra-operator-controller-manager-78d48bff9d-6ff94\" (UID: \"ae8e31fb-df50-4c43-af56-9c01af34f181\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.664782 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmhfv\" (UniqueName: \"kubernetes.io/projected/dbdd3dcf-94cf-4b1e-9918-5d8efbe60360-kube-api-access-lmhfv\") pod \"glance-operator-controller-manager-5697bb5779-v2lxt\" (UID: \"dbdd3dcf-94cf-4b1e-9918-5d8efbe60360\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-v2lxt" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.665010 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qpl6\" (UniqueName: \"kubernetes.io/projected/ae8e31fb-df50-4c43-af56-9c01af34f181-kube-api-access-5qpl6\") pod \"infra-operator-controller-manager-78d48bff9d-6ff94\" (UID: \"ae8e31fb-df50-4c43-af56-9c01af34f181\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.665137 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zplb8\" (UniqueName: \"kubernetes.io/projected/232e8d69-426a-4259-93ab-1ebb4fa89a17-kube-api-access-zplb8\") pod \"horizon-operator-controller-manager-68c6d99b8f-vcz8d\" (UID: \"232e8d69-426a-4259-93ab-1ebb4fa89a17\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.665357 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rxkm\" (UniqueName: \"kubernetes.io/projected/2f46589d-ec5b-48e9-8f64-741a6a5b3e84-kube-api-access-7rxkm\") pod \"designate-operator-controller-manager-697fb699cf-whlx7\" (UID: \"2f46589d-ec5b-48e9-8f64-741a6a5b3e84\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-whlx7" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.669076 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-x4p8r"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.669224 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-4zcw4" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.703482 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-z6dn7"] Dec 11 05:28:25 crc kubenswrapper[4628]: 
I1211 05:28:25.704363 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-z6dn7" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.710447 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jqz2j"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.714907 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-tdxp2" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.728727 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rxkm\" (UniqueName: \"kubernetes.io/projected/2f46589d-ec5b-48e9-8f64-741a6a5b3e84-kube-api-access-7rxkm\") pod \"designate-operator-controller-manager-697fb699cf-whlx7\" (UID: \"2f46589d-ec5b-48e9-8f64-741a6a5b3e84\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-whlx7" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.730052 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cjb98"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.731526 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cjb98" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.732931 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-xp4bl" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.735788 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-w5xrs"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.744064 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5xrs" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.758290 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-ws7gj" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.770308 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmhfv\" (UniqueName: \"kubernetes.io/projected/dbdd3dcf-94cf-4b1e-9918-5d8efbe60360-kube-api-access-lmhfv\") pod \"glance-operator-controller-manager-5697bb5779-v2lxt\" (UID: \"dbdd3dcf-94cf-4b1e-9918-5d8efbe60360\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-v2lxt" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.770347 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qpl6\" (UniqueName: \"kubernetes.io/projected/ae8e31fb-df50-4c43-af56-9c01af34f181-kube-api-access-5qpl6\") pod \"infra-operator-controller-manager-78d48bff9d-6ff94\" (UID: \"ae8e31fb-df50-4c43-af56-9c01af34f181\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.770379 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7w66\" (UniqueName: \"kubernetes.io/projected/c8834adf-70c2-46a6-a5d7-bdb2ddfc91d2-kube-api-access-d7w66\") pod \"ironic-operator-controller-manager-967d97867-x4p8r\" (UID: \"c8834adf-70c2-46a6-a5d7-bdb2ddfc91d2\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-x4p8r" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.770404 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zplb8\" (UniqueName: \"kubernetes.io/projected/232e8d69-426a-4259-93ab-1ebb4fa89a17-kube-api-access-zplb8\") pod \"horizon-operator-controller-manager-68c6d99b8f-vcz8d\" (UID: \"232e8d69-426a-4259-93ab-1ebb4fa89a17\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.770424 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqbgx\" (UniqueName: \"kubernetes.io/projected/c8063e93-9008-453c-805c-487456b5e0ac-kube-api-access-sqbgx\") pod \"manila-operator-controller-manager-5b5fd79c9c-bqcdn\" (UID: \"c8063e93-9008-453c-805c-487456b5e0ac\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-bqcdn" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.770457 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwhw8\" (UniqueName: \"kubernetes.io/projected/88d0bbcc-5138-434d-811b-d8db056922cb-kube-api-access-nwhw8\") pod \"mariadb-operator-controller-manager-79c8c4686c-jqz2j\" (UID: \"88d0bbcc-5138-434d-811b-d8db056922cb\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jqz2j" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.770479 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2525c\" (UniqueName: \"kubernetes.io/projected/f041b1fa-37ae-46fc-b6b0-301da06c1ff7-kube-api-access-2525c\") pod \"heat-operator-controller-manager-5f64f6f8bb-9bcfl\" (UID: \"f041b1fa-37ae-46fc-b6b0-301da06c1ff7\") " 
pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9bcfl" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.770498 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae8e31fb-df50-4c43-af56-9c01af34f181-cert\") pod \"infra-operator-controller-manager-78d48bff9d-6ff94\" (UID: \"ae8e31fb-df50-4c43-af56-9c01af34f181\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" Dec 11 05:28:25 crc kubenswrapper[4628]: E1211 05:28:25.770603 4628 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 11 05:28:25 crc kubenswrapper[4628]: E1211 05:28:25.770657 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae8e31fb-df50-4c43-af56-9c01af34f181-cert podName:ae8e31fb-df50-4c43-af56-9c01af34f181 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:26.270628107 +0000 UTC m=+808.687974795 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae8e31fb-df50-4c43-af56-9c01af34f181-cert") pod "infra-operator-controller-manager-78d48bff9d-6ff94" (UID: "ae8e31fb-df50-4c43-af56-9c01af34f181") : secret "infra-operator-webhook-server-cert" not found Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.779730 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-vftnq"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.780764 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-vftnq" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.782794 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-whlx7" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.795277 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-rb7h9" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.805858 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qpl6\" (UniqueName: \"kubernetes.io/projected/ae8e31fb-df50-4c43-af56-9c01af34f181-kube-api-access-5qpl6\") pod \"infra-operator-controller-manager-78d48bff9d-6ff94\" (UID: \"ae8e31fb-df50-4c43-af56-9c01af34f181\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.823112 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.824199 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.830431 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.837418 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bmbxt" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.853968 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2525c\" (UniqueName: \"kubernetes.io/projected/f041b1fa-37ae-46fc-b6b0-301da06c1ff7-kube-api-access-2525c\") pod \"heat-operator-controller-manager-5f64f6f8bb-9bcfl\" (UID: \"f041b1fa-37ae-46fc-b6b0-301da06c1ff7\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9bcfl" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.857342 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmhfv\" (UniqueName: \"kubernetes.io/projected/dbdd3dcf-94cf-4b1e-9918-5d8efbe60360-kube-api-access-lmhfv\") pod \"glance-operator-controller-manager-5697bb5779-v2lxt\" (UID: \"dbdd3dcf-94cf-4b1e-9918-5d8efbe60360\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-v2lxt" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.858171 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zplb8\" (UniqueName: \"kubernetes.io/projected/232e8d69-426a-4259-93ab-1ebb4fa89a17-kube-api-access-zplb8\") pod \"horizon-operator-controller-manager-68c6d99b8f-vcz8d\" (UID: \"232e8d69-426a-4259-93ab-1ebb4fa89a17\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.872072 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7w66\" (UniqueName: \"kubernetes.io/projected/c8834adf-70c2-46a6-a5d7-bdb2ddfc91d2-kube-api-access-d7w66\") pod \"ironic-operator-controller-manager-967d97867-x4p8r\" (UID: \"c8834adf-70c2-46a6-a5d7-bdb2ddfc91d2\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-x4p8r" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.872514 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6vqz\" (UniqueName: \"kubernetes.io/projected/d92dcd20-90f9-4499-bae5-f117cf41b4d5-kube-api-access-r6vqz\") pod \"nova-operator-controller-manager-697bc559fc-w5xrs\" (UID: \"d92dcd20-90f9-4499-bae5-f117cf41b4d5\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5xrs" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.872828 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqbgx\" (UniqueName: \"kubernetes.io/projected/c8063e93-9008-453c-805c-487456b5e0ac-kube-api-access-sqbgx\") pod \"manila-operator-controller-manager-5b5fd79c9c-bqcdn\" (UID: \"c8063e93-9008-453c-805c-487456b5e0ac\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-bqcdn" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.872963 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwhw8\" (UniqueName: \"kubernetes.io/projected/88d0bbcc-5138-434d-811b-d8db056922cb-kube-api-access-nwhw8\") pod 
\"mariadb-operator-controller-manager-79c8c4686c-jqz2j\" (UID: \"88d0bbcc-5138-434d-811b-d8db056922cb\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jqz2j" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.873033 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn2kw\" (UniqueName: \"kubernetes.io/projected/d0e69cfa-5f08-4640-b9f8-b7c27ef8660f-kube-api-access-vn2kw\") pod \"keystone-operator-controller-manager-7765d96ddf-z6dn7\" (UID: \"d0e69cfa-5f08-4640-b9f8-b7c27ef8660f\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-z6dn7" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.873132 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psg2k\" (UniqueName: \"kubernetes.io/projected/c2e9f8e4-3eda-4227-ad4a-8f8641f88612-kube-api-access-psg2k\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-cjb98\" (UID: \"c2e9f8e4-3eda-4227-ad4a-8f8641f88612\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cjb98" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.925738 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-w5xrs"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.925784 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4pfr" event={"ID":"cf3bfbad-c97e-47e9-9390-233f50d34f49","Type":"ContainerStarted","Data":"2ae8a74bfd2725209a548b5cbe768d8faada425ae4632440095fadc51a8c100f"} Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.936596 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqbgx\" (UniqueName: \"kubernetes.io/projected/c8063e93-9008-453c-805c-487456b5e0ac-kube-api-access-sqbgx\") pod \"manila-operator-controller-manager-5b5fd79c9c-bqcdn\" (UID: \"c8063e93-9008-453c-805c-487456b5e0ac\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-bqcdn" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.944785 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7w66\" (UniqueName: \"kubernetes.io/projected/c8834adf-70c2-46a6-a5d7-bdb2ddfc91d2-kube-api-access-d7w66\") pod \"ironic-operator-controller-manager-967d97867-x4p8r\" (UID: \"c8834adf-70c2-46a6-a5d7-bdb2ddfc91d2\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-x4p8r" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.945530 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cjb98"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.948337 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwhw8\" (UniqueName: \"kubernetes.io/projected/88d0bbcc-5138-434d-811b-d8db056922cb-kube-api-access-nwhw8\") pod \"mariadb-operator-controller-manager-79c8c4686c-jqz2j\" (UID: \"88d0bbcc-5138-434d-811b-d8db056922cb\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jqz2j" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.961489 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-vftnq"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.978061 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-vn2kw\" (UniqueName: \"kubernetes.io/projected/d0e69cfa-5f08-4640-b9f8-b7c27ef8660f-kube-api-access-vn2kw\") pod \"keystone-operator-controller-manager-7765d96ddf-z6dn7\" (UID: \"d0e69cfa-5f08-4640-b9f8-b7c27ef8660f\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-z6dn7" Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.982785 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-z6dn7"] Dec 11 05:28:25 crc kubenswrapper[4628]: I1211 05:28:25.997680 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psg2k\" (UniqueName: \"kubernetes.io/projected/c2e9f8e4-3eda-4227-ad4a-8f8641f88612-kube-api-access-psg2k\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-cjb98\" (UID: \"c2e9f8e4-3eda-4227-ad4a-8f8641f88612\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cjb98" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:25.999806 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgck9\" (UniqueName: \"kubernetes.io/projected/a7d3410e-df7b-4de8-aa0f-4c6de9e251e7-kube-api-access-pgck9\") pod \"octavia-operator-controller-manager-998648c74-vftnq\" (UID: \"a7d3410e-df7b-4de8-aa0f-4c6de9e251e7\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-vftnq" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.002419 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-x4p8r" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.003322 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3112c087-1436-4f0a-8b0c-6000b07a0f77-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f9w8m4\" (UID: \"3112c087-1436-4f0a-8b0c-6000b07a0f77\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.003360 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz9z7\" (UniqueName: \"kubernetes.io/projected/3112c087-1436-4f0a-8b0c-6000b07a0f77-kube-api-access-dz9z7\") pod \"openstack-baremetal-operator-controller-manager-84b575879f9w8m4\" (UID: \"3112c087-1436-4f0a-8b0c-6000b07a0f77\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.003419 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6vqz\" (UniqueName: \"kubernetes.io/projected/d92dcd20-90f9-4499-bae5-f117cf41b4d5-kube-api-access-r6vqz\") pod \"nova-operator-controller-manager-697bc559fc-w5xrs\" (UID: \"d92dcd20-90f9-4499-bae5-f117cf41b4d5\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5xrs" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.055387 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn2kw\" (UniqueName: \"kubernetes.io/projected/d0e69cfa-5f08-4640-b9f8-b7c27ef8660f-kube-api-access-vn2kw\") pod \"keystone-operator-controller-manager-7765d96ddf-z6dn7\" (UID: \"d0e69cfa-5f08-4640-b9f8-b7c27ef8660f\") " 
pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-z6dn7" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.076256 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6vqz\" (UniqueName: \"kubernetes.io/projected/d92dcd20-90f9-4499-bae5-f117cf41b4d5-kube-api-access-r6vqz\") pod \"nova-operator-controller-manager-697bc559fc-w5xrs\" (UID: \"d92dcd20-90f9-4499-bae5-f117cf41b4d5\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5xrs" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.080673 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psg2k\" (UniqueName: \"kubernetes.io/projected/c2e9f8e4-3eda-4227-ad4a-8f8641f88612-kube-api-access-psg2k\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-cjb98\" (UID: \"c2e9f8e4-3eda-4227-ad4a-8f8641f88612\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cjb98" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.093719 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-nc2xx"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.095692 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nc2xx" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.096484 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-bqcdn" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.108453 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgck9\" (UniqueName: \"kubernetes.io/projected/a7d3410e-df7b-4de8-aa0f-4c6de9e251e7-kube-api-access-pgck9\") pod \"octavia-operator-controller-manager-998648c74-vftnq\" (UID: \"a7d3410e-df7b-4de8-aa0f-4c6de9e251e7\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-vftnq" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.108503 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3112c087-1436-4f0a-8b0c-6000b07a0f77-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f9w8m4\" (UID: \"3112c087-1436-4f0a-8b0c-6000b07a0f77\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.108533 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz9z7\" (UniqueName: \"kubernetes.io/projected/3112c087-1436-4f0a-8b0c-6000b07a0f77-kube-api-access-dz9z7\") pod \"openstack-baremetal-operator-controller-manager-84b575879f9w8m4\" (UID: \"3112c087-1436-4f0a-8b0c-6000b07a0f77\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.108968 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-v2lxt" Dec 11 05:28:26 crc kubenswrapper[4628]: E1211 05:28:26.109088 4628 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 05:28:26 crc kubenswrapper[4628]: E1211 05:28:26.109137 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3112c087-1436-4f0a-8b0c-6000b07a0f77-cert podName:3112c087-1436-4f0a-8b0c-6000b07a0f77 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:26.609121305 +0000 UTC m=+809.026468003 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3112c087-1436-4f0a-8b0c-6000b07a0f77-cert") pod "openstack-baremetal-operator-controller-manager-84b575879f9w8m4" (UID: "3112c087-1436-4f0a-8b0c-6000b07a0f77") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.115332 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-jfl58" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.116459 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jqz2j" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.125130 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9bcfl" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.130121 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-z6dn7" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.154280 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.158058 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-nc2xx"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.158540 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cjb98" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.164389 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgck9\" (UniqueName: \"kubernetes.io/projected/a7d3410e-df7b-4de8-aa0f-4c6de9e251e7-kube-api-access-pgck9\") pod \"octavia-operator-controller-manager-998648c74-vftnq\" (UID: \"a7d3410e-df7b-4de8-aa0f-4c6de9e251e7\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-vftnq" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.165678 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz9z7\" (UniqueName: \"kubernetes.io/projected/3112c087-1436-4f0a-8b0c-6000b07a0f77-kube-api-access-dz9z7\") pod \"openstack-baremetal-operator-controller-manager-84b575879f9w8m4\" (UID: \"3112c087-1436-4f0a-8b0c-6000b07a0f77\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.181615 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-qxsdk"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.182743 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qxsdk" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.192819 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-xk77j" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.193431 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.194469 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.197190 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-qxsdk"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.197787 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5xrs" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.207615 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-brfk7" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.212870 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcf8s\" (UniqueName: \"kubernetes.io/projected/bca7bee3-0202-48ba-b0e9-3353f6ab0938-kube-api-access-xcf8s\") pod \"ovn-operator-controller-manager-b6456fdb6-nc2xx\" (UID: \"bca7bee3-0202-48ba-b0e9-3353f6ab0938\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nc2xx" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.213091 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.214236 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.224915 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-pr882" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.235873 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.237375 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.246467 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-vftnq" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.284806 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.297522 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-tnvqg"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.298708 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-tnvqg" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.314965 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggcxz\" (UniqueName: \"kubernetes.io/projected/2b9ef50b-db17-4df4-a936-5a02a25f61d7-kube-api-access-ggcxz\") pod \"telemetry-operator-controller-manager-58d5ff84df-2q4b9\" (UID: \"2b9ef50b-db17-4df4-a936-5a02a25f61d7\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.315013 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae8e31fb-df50-4c43-af56-9c01af34f181-cert\") pod \"infra-operator-controller-manager-78d48bff9d-6ff94\" (UID: \"ae8e31fb-df50-4c43-af56-9c01af34f181\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.315034 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcf8s\" (UniqueName: \"kubernetes.io/projected/bca7bee3-0202-48ba-b0e9-3353f6ab0938-kube-api-access-xcf8s\") pod \"ovn-operator-controller-manager-b6456fdb6-nc2xx\" (UID: \"bca7bee3-0202-48ba-b0e9-3353f6ab0938\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nc2xx" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.315081 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7bxh\" (UniqueName: \"kubernetes.io/projected/2b786de1-276f-470c-b60a-e93596dd9e47-kube-api-access-m7bxh\") pod \"swift-operator-controller-manager-9d58d64bc-zvjrq\" (UID: \"2b786de1-276f-470c-b60a-e93596dd9e47\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.315123 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-x9dm5\" (UniqueName: \"kubernetes.io/projected/ee29a0b0-46f9-45f6-b356-dde79504d5cc-kube-api-access-x9dm5\") pod \"placement-operator-controller-manager-78f8948974-qxsdk\" (UID: \"ee29a0b0-46f9-45f6-b356-dde79504d5cc\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-qxsdk" Dec 11 05:28:26 crc kubenswrapper[4628]: E1211 05:28:26.315362 4628 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 11 05:28:26 crc kubenswrapper[4628]: E1211 05:28:26.315403 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae8e31fb-df50-4c43-af56-9c01af34f181-cert podName:ae8e31fb-df50-4c43-af56-9c01af34f181 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:27.315388058 +0000 UTC m=+809.732734756 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae8e31fb-df50-4c43-af56-9c01af34f181-cert") pod "infra-operator-controller-manager-78d48bff9d-6ff94" (UID: "ae8e31fb-df50-4c43-af56-9c01af34f181") : secret "infra-operator-webhook-server-cert" not found Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.316229 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.316997 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-xj86z" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.317496 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.330124 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-r6l7m" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.339558 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-tnvqg"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.356361 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.370570 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcf8s\" (UniqueName: \"kubernetes.io/projected/bca7bee3-0202-48ba-b0e9-3353f6ab0938-kube-api-access-xcf8s\") pod \"ovn-operator-controller-manager-b6456fdb6-nc2xx\" (UID: \"bca7bee3-0202-48ba-b0e9-3353f6ab0938\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nc2xx" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.416703 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5ssp\" (UniqueName: \"kubernetes.io/projected/9dc3edb0-5d7f-4b4f-bea2-5f9c25b222fe-kube-api-access-j5ssp\") pod \"test-operator-controller-manager-5854674fcc-tnvqg\" (UID: \"9dc3edb0-5d7f-4b4f-bea2-5f9c25b222fe\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-tnvqg" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.416770 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7bxh\" (UniqueName: 
\"kubernetes.io/projected/2b786de1-276f-470c-b60a-e93596dd9e47-kube-api-access-m7bxh\") pod \"swift-operator-controller-manager-9d58d64bc-zvjrq\" (UID: \"2b786de1-276f-470c-b60a-e93596dd9e47\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.416827 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9dm5\" (UniqueName: \"kubernetes.io/projected/ee29a0b0-46f9-45f6-b356-dde79504d5cc-kube-api-access-x9dm5\") pod \"placement-operator-controller-manager-78f8948974-qxsdk\" (UID: \"ee29a0b0-46f9-45f6-b356-dde79504d5cc\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-qxsdk" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.416901 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggcxz\" (UniqueName: \"kubernetes.io/projected/2b9ef50b-db17-4df4-a936-5a02a25f61d7-kube-api-access-ggcxz\") pod \"telemetry-operator-controller-manager-58d5ff84df-2q4b9\" (UID: \"2b9ef50b-db17-4df4-a936-5a02a25f61d7\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.416949 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fbhp\" (UniqueName: \"kubernetes.io/projected/53a3113c-a3d2-42c8-8ab8-b26b448a728a-kube-api-access-2fbhp\") pod \"watcher-operator-controller-manager-75944c9b7-l2wf4\" (UID: \"53a3113c-a3d2-42c8-8ab8-b26b448a728a\") " pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.417571 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.418428 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.430742 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.431071 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.431226 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-tsl56" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.451138 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggcxz\" (UniqueName: \"kubernetes.io/projected/2b9ef50b-db17-4df4-a936-5a02a25f61d7-kube-api-access-ggcxz\") pod \"telemetry-operator-controller-manager-58d5ff84df-2q4b9\" (UID: \"2b9ef50b-db17-4df4-a936-5a02a25f61d7\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.456010 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7bxh\" (UniqueName: \"kubernetes.io/projected/2b786de1-276f-470c-b60a-e93596dd9e47-kube-api-access-m7bxh\") pod \"swift-operator-controller-manager-9d58d64bc-zvjrq\" (UID: \"2b786de1-276f-470c-b60a-e93596dd9e47\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.457738 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9dm5\" (UniqueName: \"kubernetes.io/projected/ee29a0b0-46f9-45f6-b356-dde79504d5cc-kube-api-access-x9dm5\") pod \"placement-operator-controller-manager-78f8948974-qxsdk\" (UID: \"ee29a0b0-46f9-45f6-b356-dde79504d5cc\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-qxsdk" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.481063 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.514895 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l85kc"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.515791 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l85kc" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.517771 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.524985 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fbhp\" (UniqueName: \"kubernetes.io/projected/53a3113c-a3d2-42c8-8ab8-b26b448a728a-kube-api-access-2fbhp\") pod \"watcher-operator-controller-manager-75944c9b7-l2wf4\" (UID: \"53a3113c-a3d2-42c8-8ab8-b26b448a728a\") " pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.525082 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4sk8\" (UniqueName: \"kubernetes.io/projected/c0ac60c7-7b87-490a-9107-ad5de9864845-kube-api-access-v4sk8\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.525172 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5ssp\" (UniqueName: \"kubernetes.io/projected/9dc3edb0-5d7f-4b4f-bea2-5f9c25b222fe-kube-api-access-j5ssp\") pod \"test-operator-controller-manager-5854674fcc-tnvqg\" (UID: \"9dc3edb0-5d7f-4b4f-bea2-5f9c25b222fe\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-tnvqg" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.525303 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.535352 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l85kc"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.536403 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-p2qd9" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.541301 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nc2xx" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.544994 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-whlx7"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.570264 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qxsdk" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.574650 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fbhp\" (UniqueName: \"kubernetes.io/projected/53a3113c-a3d2-42c8-8ab8-b26b448a728a-kube-api-access-2fbhp\") pod \"watcher-operator-controller-manager-75944c9b7-l2wf4\" (UID: \"53a3113c-a3d2-42c8-8ab8-b26b448a728a\") " pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.577537 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5ssp\" (UniqueName: \"kubernetes.io/projected/9dc3edb0-5d7f-4b4f-bea2-5f9c25b222fe-kube-api-access-j5ssp\") pod \"test-operator-controller-manager-5854674fcc-tnvqg\" (UID: \"9dc3edb0-5d7f-4b4f-bea2-5f9c25b222fe\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-tnvqg" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.578919 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.611193 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.628291 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3112c087-1436-4f0a-8b0c-6000b07a0f77-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f9w8m4\" (UID: \"3112c087-1436-4f0a-8b0c-6000b07a0f77\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.628346 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.628377 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.628415 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zkpm\" (UniqueName: \"kubernetes.io/projected/938faeea-3048-4d4a-8f3d-e22b31c73f47-kube-api-access-5zkpm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-l85kc\" (UID: \"938faeea-3048-4d4a-8f3d-e22b31c73f47\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l85kc" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.636177 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4sk8\" (UniqueName: \"kubernetes.io/projected/c0ac60c7-7b87-490a-9107-ad5de9864845-kube-api-access-v4sk8\") pod 
\"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:26 crc kubenswrapper[4628]: E1211 05:28:26.644402 4628 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 11 05:28:26 crc kubenswrapper[4628]: E1211 05:28:26.644474 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs podName:c0ac60c7-7b87-490a-9107-ad5de9864845 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:27.144454862 +0000 UTC m=+809.561801560 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs") pod "openstack-operator-controller-manager-7546d6d447-f9qwn" (UID: "c0ac60c7-7b87-490a-9107-ad5de9864845") : secret "metrics-server-cert" not found Dec 11 05:28:26 crc kubenswrapper[4628]: E1211 05:28:26.644791 4628 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 05:28:26 crc kubenswrapper[4628]: E1211 05:28:26.644828 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3112c087-1436-4f0a-8b0c-6000b07a0f77-cert podName:3112c087-1436-4f0a-8b0c-6000b07a0f77 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:27.644811922 +0000 UTC m=+810.062158620 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3112c087-1436-4f0a-8b0c-6000b07a0f77-cert") pod "openstack-baremetal-operator-controller-manager-84b575879f9w8m4" (UID: "3112c087-1436-4f0a-8b0c-6000b07a0f77") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 05:28:26 crc kubenswrapper[4628]: E1211 05:28:26.644887 4628 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 11 05:28:26 crc kubenswrapper[4628]: E1211 05:28:26.644911 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs podName:c0ac60c7-7b87-490a-9107-ad5de9864845 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:27.144904244 +0000 UTC m=+809.562250942 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs") pod "openstack-operator-controller-manager-7546d6d447-f9qwn" (UID: "c0ac60c7-7b87-490a-9107-ad5de9864845") : secret "webhook-server-cert" not found Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.664173 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.677297 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-tnvqg" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.683656 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4sk8\" (UniqueName: \"kubernetes.io/projected/c0ac60c7-7b87-490a-9107-ad5de9864845-kube-api-access-v4sk8\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.747708 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zkpm\" (UniqueName: \"kubernetes.io/projected/938faeea-3048-4d4a-8f3d-e22b31c73f47-kube-api-access-5zkpm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-l85kc\" (UID: \"938faeea-3048-4d4a-8f3d-e22b31c73f47\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l85kc" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.779646 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-9d9wj"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.794563 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zkpm\" (UniqueName: \"kubernetes.io/projected/938faeea-3048-4d4a-8f3d-e22b31c73f47-kube-api-access-5zkpm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-l85kc\" (UID: \"938faeea-3048-4d4a-8f3d-e22b31c73f47\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l85kc" Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.844423 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-h5xhk"] Dec 11 05:28:26 crc kubenswrapper[4628]: I1211 05:28:26.939489 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-whlx7" event={"ID":"2f46589d-ec5b-48e9-8f64-741a6a5b3e84","Type":"ContainerStarted","Data":"e1424611bd835b5525c20a61bb8c3672364d2997c9d1841dabf9463b02eada26"} Dec 11 05:28:27 crc kubenswrapper[4628]: I1211 05:28:27.018209 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l85kc" Dec 11 05:28:27 crc kubenswrapper[4628]: I1211 05:28:27.094492 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-x4p8r"] Dec 11 05:28:27 crc kubenswrapper[4628]: I1211 05:28:27.152367 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:27 crc kubenswrapper[4628]: E1211 05:28:27.153047 4628 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 11 05:28:27 crc kubenswrapper[4628]: E1211 05:28:27.153246 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs podName:c0ac60c7-7b87-490a-9107-ad5de9864845 nodeName:}" failed. 
No retries permitted until 2025-12-11 05:28:28.153226033 +0000 UTC m=+810.570572731 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs") pod "openstack-operator-controller-manager-7546d6d447-f9qwn" (UID: "c0ac60c7-7b87-490a-9107-ad5de9864845") : secret "webhook-server-cert" not found Dec 11 05:28:27 crc kubenswrapper[4628]: E1211 05:28:27.153322 4628 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 11 05:28:27 crc kubenswrapper[4628]: E1211 05:28:27.153495 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs podName:c0ac60c7-7b87-490a-9107-ad5de9864845 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:28.15348601 +0000 UTC m=+810.570832708 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs") pod "openstack-operator-controller-manager-7546d6d447-f9qwn" (UID: "c0ac60c7-7b87-490a-9107-ad5de9864845") : secret "metrics-server-cert" not found Dec 11 05:28:27 crc kubenswrapper[4628]: I1211 05:28:27.153636 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:27 crc kubenswrapper[4628]: I1211 05:28:27.175056 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-bqcdn"] Dec 11 05:28:27 crc kubenswrapper[4628]: I1211 05:28:27.361425 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae8e31fb-df50-4c43-af56-9c01af34f181-cert\") pod \"infra-operator-controller-manager-78d48bff9d-6ff94\" (UID: \"ae8e31fb-df50-4c43-af56-9c01af34f181\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" Dec 11 05:28:27 crc kubenswrapper[4628]: E1211 05:28:27.361621 4628 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 11 05:28:27 crc kubenswrapper[4628]: E1211 05:28:27.361671 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae8e31fb-df50-4c43-af56-9c01af34f181-cert podName:ae8e31fb-df50-4c43-af56-9c01af34f181 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:29.361655653 +0000 UTC m=+811.779002351 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae8e31fb-df50-4c43-af56-9c01af34f181-cert") pod "infra-operator-controller-manager-78d48bff9d-6ff94" (UID: "ae8e31fb-df50-4c43-af56-9c01af34f181") : secret "infra-operator-webhook-server-cert" not found Dec 11 05:28:27 crc kubenswrapper[4628]: I1211 05:28:27.507907 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cjb98"] Dec 11 05:28:27 crc kubenswrapper[4628]: I1211 05:28:27.564350 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9bcfl"] Dec 11 05:28:27 crc kubenswrapper[4628]: I1211 05:28:27.653336 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-z6dn7"] Dec 11 05:28:27 crc kubenswrapper[4628]: I1211 05:28:27.660540 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-w5xrs"] Dec 11 05:28:27 crc kubenswrapper[4628]: I1211 05:28:27.667890 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3112c087-1436-4f0a-8b0c-6000b07a0f77-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f9w8m4\" (UID: \"3112c087-1436-4f0a-8b0c-6000b07a0f77\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" Dec 11 05:28:27 crc kubenswrapper[4628]: E1211 05:28:27.668032 4628 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 05:28:27 crc kubenswrapper[4628]: E1211 05:28:27.668068 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3112c087-1436-4f0a-8b0c-6000b07a0f77-cert podName:3112c087-1436-4f0a-8b0c-6000b07a0f77 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:29.668055115 +0000 UTC m=+812.085401813 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3112c087-1436-4f0a-8b0c-6000b07a0f77-cert") pod "openstack-baremetal-operator-controller-manager-84b575879f9w8m4" (UID: "3112c087-1436-4f0a-8b0c-6000b07a0f77") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 05:28:27 crc kubenswrapper[4628]: I1211 05:28:27.668936 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jqz2j"] Dec 11 05:28:27 crc kubenswrapper[4628]: I1211 05:28:27.905426 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-vftnq"] Dec 11 05:28:27 crc kubenswrapper[4628]: I1211 05:28:27.905680 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-nc2xx"] Dec 11 05:28:27 crc kubenswrapper[4628]: I1211 05:28:27.933198 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-v2lxt"] Dec 11 05:28:27 crc kubenswrapper[4628]: I1211 05:28:27.993707 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jqz2j" event={"ID":"88d0bbcc-5138-434d-811b-d8db056922cb","Type":"ContainerStarted","Data":"d6bab6f32f15ee3f95ac18bfcdb11fad06df325e4b257774dcce0659a30411b6"} Dec 11 05:28:27 crc kubenswrapper[4628]: I1211 05:28:27.999579 4628 generic.go:334] "Generic (PLEG): container finished" podID="cf3bfbad-c97e-47e9-9390-233f50d34f49" containerID="2ae8a74bfd2725209a548b5cbe768d8faada425ae4632440095fadc51a8c100f" exitCode=0 Dec 11 05:28:27 crc kubenswrapper[4628]: I1211 05:28:27.999658 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4pfr" event={"ID":"cf3bfbad-c97e-47e9-9390-233f50d34f49","Type":"ContainerDied","Data":"2ae8a74bfd2725209a548b5cbe768d8faada425ae4632440095fadc51a8c100f"} Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.008639 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5xrs" event={"ID":"d92dcd20-90f9-4499-bae5-f117cf41b4d5","Type":"ContainerStarted","Data":"480d88f8aee6171c883e5606ad6cd608e8841be14031ca0c290c5474ebcc1488"} Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.009962 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-h5xhk" event={"ID":"43de67af-1cf5-4412-833e-e95e2ffcc47b","Type":"ContainerStarted","Data":"0c75702addebd58fb9a7e52349f93a6f64d252bd4aa6b7ae8e0055fb1e8d0dcb"} Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.019888 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-x4p8r" event={"ID":"c8834adf-70c2-46a6-a5d7-bdb2ddfc91d2","Type":"ContainerStarted","Data":"b425f451e995d62688669d99021e27a1b91062b9d3f251d0369ff08b23e7fab0"} Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.021830 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nc2xx" event={"ID":"bca7bee3-0202-48ba-b0e9-3353f6ab0938","Type":"ContainerStarted","Data":"7965ee21cf886a8da19a0c4b334699f73a8487a964fb07510c28e6625b6351ba"} Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.023893 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-z6dn7" event={"ID":"d0e69cfa-5f08-4640-b9f8-b7c27ef8660f","Type":"ContainerStarted","Data":"9efe371754241cc164e9774cf9cb3769aafdf46c250f4dc504aae614e1370cdb"} Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.024674 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9bcfl" event={"ID":"f041b1fa-37ae-46fc-b6b0-301da06c1ff7","Type":"ContainerStarted","Data":"1b5e04b75e9ad10d3562be505c99ea6023fc87ebb2386ace83704c45b7732977"} Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.031208 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9d9wj" event={"ID":"f7d58419-0988-4a35-800f-2298db8e6597","Type":"ContainerStarted","Data":"2288424ff21c991d0a7e210ae2d66eda2099c691f89926f668bc5a3f80fa27a2"} Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.046155 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cjb98" event={"ID":"c2e9f8e4-3eda-4227-ad4a-8f8641f88612","Type":"ContainerStarted","Data":"d9fbe3355a045fbabb95623582fa17f245cdc54529d24bce0678d7cb42b5f614"} Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.048031 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-bqcdn" event={"ID":"c8063e93-9008-453c-805c-487456b5e0ac","Type":"ContainerStarted","Data":"c7e75db9a69d4019af31d530874fffe5797b4c6dcdeebd4ae7395b441f850e70"} Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.050785 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-vftnq" event={"ID":"a7d3410e-df7b-4de8-aa0f-4c6de9e251e7","Type":"ContainerStarted","Data":"1990a667ab206c7f534d1304f5d155e90af526329b8a1e8b949d482b80c0e585"} Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.076773 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-qxsdk"] Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.098901 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d"] Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.108917 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l85kc"] Dec 11 05:28:28 crc kubenswrapper[4628]: E1211 05:28:28.125413 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zplb8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-vcz8d_openstack-operators(232e8d69-426a-4259-93ab-1ebb4fa89a17): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.131361 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-tnvqg"] Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.136728 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9"] Dec 11 05:28:28 crc kubenswrapper[4628]: E1211 05:28:28.142410 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zplb8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-vcz8d_openstack-operators(232e8d69-426a-4259-93ab-1ebb4fa89a17): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 11 05:28:28 crc kubenswrapper[4628]: E1211 05:28:28.143979 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d" podUID="232e8d69-426a-4259-93ab-1ebb4fa89a17" Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.162620 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4"] Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.199439 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.199567 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:28 crc kubenswrapper[4628]: E1211 05:28:28.200044 4628 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 11 05:28:28 crc kubenswrapper[4628]: E1211 05:28:28.200137 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs podName:c0ac60c7-7b87-490a-9107-ad5de9864845 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:30.200109154 +0000 UTC m=+812.617455852 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs") pod "openstack-operator-controller-manager-7546d6d447-f9qwn" (UID: "c0ac60c7-7b87-490a-9107-ad5de9864845") : secret "metrics-server-cert" not found Dec 11 05:28:28 crc kubenswrapper[4628]: E1211 05:28:28.200216 4628 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 11 05:28:28 crc kubenswrapper[4628]: E1211 05:28:28.200264 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs podName:c0ac60c7-7b87-490a-9107-ad5de9864845 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:30.200252017 +0000 UTC m=+812.617598715 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs") pod "openstack-operator-controller-manager-7546d6d447-f9qwn" (UID: "c0ac60c7-7b87-490a-9107-ad5de9864845") : secret "webhook-server-cert" not found Dec 11 05:28:28 crc kubenswrapper[4628]: E1211 05:28:28.206454 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2fbhp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
watcher-operator-controller-manager-75944c9b7-l2wf4_openstack-operators(53a3113c-a3d2-42c8-8ab8-b26b448a728a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 11 05:28:28 crc kubenswrapper[4628]: E1211 05:28:28.210577 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2fbhp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-75944c9b7-l2wf4_openstack-operators(53a3113c-a3d2-42c8-8ab8-b26b448a728a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 11 05:28:28 crc kubenswrapper[4628]: E1211 05:28:28.210722 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ggcxz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-2q4b9_openstack-operators(2b9ef50b-db17-4df4-a936-5a02a25f61d7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 11 05:28:28 crc kubenswrapper[4628]: E1211 05:28:28.210832 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m7bxh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-zvjrq_openstack-operators(2b786de1-276f-470c-b60a-e93596dd9e47): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 11 05:28:28 crc 
kubenswrapper[4628]: E1211 05:28:28.212468 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4" podUID="53a3113c-a3d2-42c8-8ab8-b26b448a728a" Dec 11 05:28:28 crc kubenswrapper[4628]: E1211 05:28:28.218101 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m7bxh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-zvjrq_openstack-operators(2b786de1-276f-470c-b60a-e93596dd9e47): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 11 05:28:28 crc kubenswrapper[4628]: E1211 05:28:28.218776 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ggcxz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-2q4b9_openstack-operators(2b9ef50b-db17-4df4-a936-5a02a25f61d7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 11 05:28:28 crc kubenswrapper[4628]: E1211 05:28:28.219526 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq" podUID="2b786de1-276f-470c-b60a-e93596dd9e47" Dec 11 05:28:28 crc kubenswrapper[4628]: E1211 05:28:28.222597 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9" podUID="2b9ef50b-db17-4df4-a936-5a02a25f61d7" Dec 11 05:28:28 crc kubenswrapper[4628]: I1211 05:28:28.236055 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq"] Dec 11 05:28:29 crc kubenswrapper[4628]: I1211 05:28:29.065706 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-tnvqg" event={"ID":"9dc3edb0-5d7f-4b4f-bea2-5f9c25b222fe","Type":"ContainerStarted","Data":"d5f771a741d386590f4b869da97801718f4c8b1e41ed2cc2fb7fc7405d86b7cc"} Dec 11 05:28:29 crc kubenswrapper[4628]: I1211 05:28:29.101118 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4" event={"ID":"53a3113c-a3d2-42c8-8ab8-b26b448a728a","Type":"ContainerStarted","Data":"4aee6897d0332364133d05500831bcc78655cc63abda0f0c00b7faedad40be2d"} Dec 11 05:28:29 crc kubenswrapper[4628]: E1211 05:28:29.112510 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4" podUID="53a3113c-a3d2-42c8-8ab8-b26b448a728a" Dec 11 05:28:29 crc kubenswrapper[4628]: I1211 05:28:29.120456 4628 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9" event={"ID":"2b9ef50b-db17-4df4-a936-5a02a25f61d7","Type":"ContainerStarted","Data":"2cea349918adb8f9de143190a8e782cfbfc05c704fb161f1dd03f29a0ae97087"} Dec 11 05:28:29 crc kubenswrapper[4628]: E1211 05:28:29.130935 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9" podUID="2b9ef50b-db17-4df4-a936-5a02a25f61d7" Dec 11 05:28:29 crc kubenswrapper[4628]: I1211 05:28:29.135700 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4pfr" event={"ID":"cf3bfbad-c97e-47e9-9390-233f50d34f49","Type":"ContainerStarted","Data":"0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe"} Dec 11 05:28:29 crc kubenswrapper[4628]: I1211 05:28:29.139068 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d" event={"ID":"232e8d69-426a-4259-93ab-1ebb4fa89a17","Type":"ContainerStarted","Data":"d3844340f2ede5bdd4a7ca17a04164ba02338bcbb0d38f5e316a3e6ede7f6df7"} Dec 11 05:28:29 crc kubenswrapper[4628]: I1211 05:28:29.182637 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k4pfr" podStartSLOduration=2.5027242530000002 podStartE2EDuration="6.18261658s" podCreationTimestamp="2025-12-11 05:28:23 +0000 UTC" firstStartedPulling="2025-12-11 05:28:24.855217011 +0000 UTC m=+807.272563709" lastFinishedPulling="2025-12-11 05:28:28.535109338 +0000 UTC m=+810.952456036" observedRunningTime="2025-12-11 05:28:29.178305904 +0000 UTC m=+811.595652622" watchObservedRunningTime="2025-12-11 05:28:29.18261658 +0000 UTC m=+811.599963308" Dec 11 05:28:29 crc kubenswrapper[4628]: E1211 05:28:29.185075 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d" podUID="232e8d69-426a-4259-93ab-1ebb4fa89a17" Dec 11 05:28:29 crc kubenswrapper[4628]: I1211 05:28:29.185524 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qxsdk" event={"ID":"ee29a0b0-46f9-45f6-b356-dde79504d5cc","Type":"ContainerStarted","Data":"67949a7746c3c1512194fd681a11fe55196251415b9c4cf1235325859ef6a9b3"} Dec 11 05:28:29 crc kubenswrapper[4628]: I1211 05:28:29.192197 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l85kc" event={"ID":"938faeea-3048-4d4a-8f3d-e22b31c73f47","Type":"ContainerStarted","Data":"692e063b48ddfc78eeb127c5e1611f1bc539e2f15f547de6f24d5638248f92fc"} Dec 11 
05:28:29 crc kubenswrapper[4628]: I1211 05:28:29.193626 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq" event={"ID":"2b786de1-276f-470c-b60a-e93596dd9e47","Type":"ContainerStarted","Data":"768380f7b7c6ddd66121d4dcd5a72061a694234a715b99c55588b01b22d902cd"} Dec 11 05:28:29 crc kubenswrapper[4628]: E1211 05:28:29.201610 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq" podUID="2b786de1-276f-470c-b60a-e93596dd9e47" Dec 11 05:28:29 crc kubenswrapper[4628]: I1211 05:28:29.228044 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-v2lxt" event={"ID":"dbdd3dcf-94cf-4b1e-9918-5d8efbe60360","Type":"ContainerStarted","Data":"31ef7a39f2bdd17c8c7e5ba1bb525484a2b1b435155294949d9efe9f7d64dd1d"} Dec 11 05:28:29 crc kubenswrapper[4628]: I1211 05:28:29.435621 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae8e31fb-df50-4c43-af56-9c01af34f181-cert\") pod \"infra-operator-controller-manager-78d48bff9d-6ff94\" (UID: \"ae8e31fb-df50-4c43-af56-9c01af34f181\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" Dec 11 05:28:29 crc kubenswrapper[4628]: E1211 05:28:29.435763 4628 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 11 05:28:29 crc kubenswrapper[4628]: E1211 05:28:29.435811 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae8e31fb-df50-4c43-af56-9c01af34f181-cert podName:ae8e31fb-df50-4c43-af56-9c01af34f181 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:33.435795027 +0000 UTC m=+815.853141725 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae8e31fb-df50-4c43-af56-9c01af34f181-cert") pod "infra-operator-controller-manager-78d48bff9d-6ff94" (UID: "ae8e31fb-df50-4c43-af56-9c01af34f181") : secret "infra-operator-webhook-server-cert" not found Dec 11 05:28:29 crc kubenswrapper[4628]: I1211 05:28:29.758572 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3112c087-1436-4f0a-8b0c-6000b07a0f77-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f9w8m4\" (UID: \"3112c087-1436-4f0a-8b0c-6000b07a0f77\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" Dec 11 05:28:29 crc kubenswrapper[4628]: E1211 05:28:29.758822 4628 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 05:28:29 crc kubenswrapper[4628]: E1211 05:28:29.758884 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3112c087-1436-4f0a-8b0c-6000b07a0f77-cert podName:3112c087-1436-4f0a-8b0c-6000b07a0f77 nodeName:}" failed. 
No retries permitted until 2025-12-11 05:28:33.75886828 +0000 UTC m=+816.176214978 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3112c087-1436-4f0a-8b0c-6000b07a0f77-cert") pod "openstack-baremetal-operator-controller-manager-84b575879f9w8m4" (UID: "3112c087-1436-4f0a-8b0c-6000b07a0f77") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 05:28:30 crc kubenswrapper[4628]: I1211 05:28:30.269004 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:30 crc kubenswrapper[4628]: I1211 05:28:30.269139 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:30 crc kubenswrapper[4628]: E1211 05:28:30.269449 4628 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 11 05:28:30 crc kubenswrapper[4628]: E1211 05:28:30.269510 4628 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 11 05:28:30 crc kubenswrapper[4628]: E1211 05:28:30.269543 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs podName:c0ac60c7-7b87-490a-9107-ad5de9864845 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:34.269478539 +0000 UTC m=+816.686825237 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs") pod "openstack-operator-controller-manager-7546d6d447-f9qwn" (UID: "c0ac60c7-7b87-490a-9107-ad5de9864845") : secret "metrics-server-cert" not found Dec 11 05:28:30 crc kubenswrapper[4628]: E1211 05:28:30.269556 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs podName:c0ac60c7-7b87-490a-9107-ad5de9864845 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:34.269551361 +0000 UTC m=+816.686898059 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs") pod "openstack-operator-controller-manager-7546d6d447-f9qwn" (UID: "c0ac60c7-7b87-490a-9107-ad5de9864845") : secret "webhook-server-cert" not found Dec 11 05:28:30 crc kubenswrapper[4628]: E1211 05:28:30.270797 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq" podUID="2b786de1-276f-470c-b60a-e93596dd9e47" Dec 11 05:28:30 crc kubenswrapper[4628]: E1211 05:28:30.270957 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9" podUID="2b9ef50b-db17-4df4-a936-5a02a25f61d7" Dec 11 05:28:30 crc kubenswrapper[4628]: E1211 05:28:30.271078 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4" podUID="53a3113c-a3d2-42c8-8ab8-b26b448a728a" Dec 11 05:28:30 crc kubenswrapper[4628]: E1211 05:28:30.271207 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d" podUID="232e8d69-426a-4259-93ab-1ebb4fa89a17" Dec 11 05:28:30 crc kubenswrapper[4628]: I1211 05:28:30.978910 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2nftm"] Dec 11 05:28:31 crc kubenswrapper[4628]: I1211 05:28:30.997313 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2nftm"] Dec 11 05:28:31 crc kubenswrapper[4628]: I1211 05:28:30.997456 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2nftm" Dec 11 05:28:31 crc kubenswrapper[4628]: I1211 05:28:31.095362 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0150af6-b59b-4a9f-89da-1d6cdcd79a44-catalog-content\") pod \"redhat-marketplace-2nftm\" (UID: \"d0150af6-b59b-4a9f-89da-1d6cdcd79a44\") " pod="openshift-marketplace/redhat-marketplace-2nftm" Dec 11 05:28:31 crc kubenswrapper[4628]: I1211 05:28:31.095471 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbkdh\" (UniqueName: \"kubernetes.io/projected/d0150af6-b59b-4a9f-89da-1d6cdcd79a44-kube-api-access-lbkdh\") pod \"redhat-marketplace-2nftm\" (UID: \"d0150af6-b59b-4a9f-89da-1d6cdcd79a44\") " pod="openshift-marketplace/redhat-marketplace-2nftm" Dec 11 05:28:31 crc kubenswrapper[4628]: I1211 05:28:31.095490 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0150af6-b59b-4a9f-89da-1d6cdcd79a44-utilities\") pod \"redhat-marketplace-2nftm\" (UID: \"d0150af6-b59b-4a9f-89da-1d6cdcd79a44\") " pod="openshift-marketplace/redhat-marketplace-2nftm" Dec 11 05:28:31 crc kubenswrapper[4628]: I1211 05:28:31.197671 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0150af6-b59b-4a9f-89da-1d6cdcd79a44-catalog-content\") pod \"redhat-marketplace-2nftm\" (UID: \"d0150af6-b59b-4a9f-89da-1d6cdcd79a44\") " pod="openshift-marketplace/redhat-marketplace-2nftm" Dec 11 05:28:31 crc kubenswrapper[4628]: I1211 05:28:31.197821 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbkdh\" (UniqueName: \"kubernetes.io/projected/d0150af6-b59b-4a9f-89da-1d6cdcd79a44-kube-api-access-lbkdh\") pod \"redhat-marketplace-2nftm\" (UID: \"d0150af6-b59b-4a9f-89da-1d6cdcd79a44\") " pod="openshift-marketplace/redhat-marketplace-2nftm" Dec 11 05:28:31 crc kubenswrapper[4628]: I1211 05:28:31.197918 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0150af6-b59b-4a9f-89da-1d6cdcd79a44-utilities\") pod \"redhat-marketplace-2nftm\" (UID: \"d0150af6-b59b-4a9f-89da-1d6cdcd79a44\") " pod="openshift-marketplace/redhat-marketplace-2nftm" Dec 11 05:28:31 crc kubenswrapper[4628]: I1211 05:28:31.198520 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0150af6-b59b-4a9f-89da-1d6cdcd79a44-utilities\") pod \"redhat-marketplace-2nftm\" (UID: \"d0150af6-b59b-4a9f-89da-1d6cdcd79a44\") " pod="openshift-marketplace/redhat-marketplace-2nftm" Dec 11 05:28:31 crc kubenswrapper[4628]: I1211 05:28:31.198898 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0150af6-b59b-4a9f-89da-1d6cdcd79a44-catalog-content\") pod \"redhat-marketplace-2nftm\" (UID: \"d0150af6-b59b-4a9f-89da-1d6cdcd79a44\") " pod="openshift-marketplace/redhat-marketplace-2nftm" Dec 11 05:28:31 crc kubenswrapper[4628]: I1211 05:28:31.223298 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbkdh\" (UniqueName: \"kubernetes.io/projected/d0150af6-b59b-4a9f-89da-1d6cdcd79a44-kube-api-access-lbkdh\") pod 
\"redhat-marketplace-2nftm\" (UID: \"d0150af6-b59b-4a9f-89da-1d6cdcd79a44\") " pod="openshift-marketplace/redhat-marketplace-2nftm" Dec 11 05:28:31 crc kubenswrapper[4628]: I1211 05:28:31.411895 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2nftm" Dec 11 05:28:31 crc kubenswrapper[4628]: I1211 05:28:31.994770 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2nftm"] Dec 11 05:28:32 crc kubenswrapper[4628]: I1211 05:28:32.324980 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2nftm" event={"ID":"d0150af6-b59b-4a9f-89da-1d6cdcd79a44","Type":"ContainerStarted","Data":"ef219b79617848f03753067cb2cb64fba14f10be55cb183bec225c1a7e254b8b"} Dec 11 05:28:33 crc kubenswrapper[4628]: I1211 05:28:33.450138 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae8e31fb-df50-4c43-af56-9c01af34f181-cert\") pod \"infra-operator-controller-manager-78d48bff9d-6ff94\" (UID: \"ae8e31fb-df50-4c43-af56-9c01af34f181\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" Dec 11 05:28:33 crc kubenswrapper[4628]: E1211 05:28:33.450290 4628 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 11 05:28:33 crc kubenswrapper[4628]: E1211 05:28:33.450345 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae8e31fb-df50-4c43-af56-9c01af34f181-cert podName:ae8e31fb-df50-4c43-af56-9c01af34f181 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:41.450330199 +0000 UTC m=+823.867676897 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae8e31fb-df50-4c43-af56-9c01af34f181-cert") pod "infra-operator-controller-manager-78d48bff9d-6ff94" (UID: "ae8e31fb-df50-4c43-af56-9c01af34f181") : secret "infra-operator-webhook-server-cert" not found Dec 11 05:28:33 crc kubenswrapper[4628]: I1211 05:28:33.856255 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3112c087-1436-4f0a-8b0c-6000b07a0f77-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f9w8m4\" (UID: \"3112c087-1436-4f0a-8b0c-6000b07a0f77\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" Dec 11 05:28:33 crc kubenswrapper[4628]: E1211 05:28:33.856577 4628 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 05:28:33 crc kubenswrapper[4628]: E1211 05:28:33.856965 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3112c087-1436-4f0a-8b0c-6000b07a0f77-cert podName:3112c087-1436-4f0a-8b0c-6000b07a0f77 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:41.856944225 +0000 UTC m=+824.274290923 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3112c087-1436-4f0a-8b0c-6000b07a0f77-cert") pod "openstack-baremetal-operator-controller-manager-84b575879f9w8m4" (UID: "3112c087-1436-4f0a-8b0c-6000b07a0f77") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 11 05:28:33 crc kubenswrapper[4628]: I1211 05:28:33.929869 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k4pfr" Dec 11 05:28:33 crc kubenswrapper[4628]: I1211 05:28:33.930303 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k4pfr" Dec 11 05:28:34 crc kubenswrapper[4628]: I1211 05:28:34.375305 4628 generic.go:334] "Generic (PLEG): container finished" podID="d0150af6-b59b-4a9f-89da-1d6cdcd79a44" containerID="936f3514df17a97987e66081133003ab791ebb88b94593b8a6b252f0e33b1533" exitCode=0 Dec 11 05:28:34 crc kubenswrapper[4628]: I1211 05:28:34.375385 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2nftm" event={"ID":"d0150af6-b59b-4a9f-89da-1d6cdcd79a44","Type":"ContainerDied","Data":"936f3514df17a97987e66081133003ab791ebb88b94593b8a6b252f0e33b1533"} Dec 11 05:28:34 crc kubenswrapper[4628]: I1211 05:28:34.375974 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:34 crc kubenswrapper[4628]: I1211 05:28:34.376036 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:34 crc kubenswrapper[4628]: E1211 05:28:34.376195 4628 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 11 05:28:34 crc kubenswrapper[4628]: E1211 05:28:34.376250 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs podName:c0ac60c7-7b87-490a-9107-ad5de9864845 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:42.376234038 +0000 UTC m=+824.793580736 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs") pod "openstack-operator-controller-manager-7546d6d447-f9qwn" (UID: "c0ac60c7-7b87-490a-9107-ad5de9864845") : secret "metrics-server-cert" not found Dec 11 05:28:34 crc kubenswrapper[4628]: E1211 05:28:34.376307 4628 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 11 05:28:34 crc kubenswrapper[4628]: E1211 05:28:34.376344 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs podName:c0ac60c7-7b87-490a-9107-ad5de9864845 nodeName:}" failed. 
No retries permitted until 2025-12-11 05:28:42.376332751 +0000 UTC m=+824.793679449 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs") pod "openstack-operator-controller-manager-7546d6d447-f9qwn" (UID: "c0ac60c7-7b87-490a-9107-ad5de9864845") : secret "webhook-server-cert" not found Dec 11 05:28:35 crc kubenswrapper[4628]: I1211 05:28:35.062893 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-k4pfr" podUID="cf3bfbad-c97e-47e9-9390-233f50d34f49" containerName="registry-server" probeResult="failure" output=< Dec 11 05:28:35 crc kubenswrapper[4628]: timeout: failed to connect service ":50051" within 1s Dec 11 05:28:35 crc kubenswrapper[4628]: > Dec 11 05:28:41 crc kubenswrapper[4628]: I1211 05:28:41.498724 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae8e31fb-df50-4c43-af56-9c01af34f181-cert\") pod \"infra-operator-controller-manager-78d48bff9d-6ff94\" (UID: \"ae8e31fb-df50-4c43-af56-9c01af34f181\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" Dec 11 05:28:41 crc kubenswrapper[4628]: I1211 05:28:41.513034 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae8e31fb-df50-4c43-af56-9c01af34f181-cert\") pod \"infra-operator-controller-manager-78d48bff9d-6ff94\" (UID: \"ae8e31fb-df50-4c43-af56-9c01af34f181\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" Dec 11 05:28:41 crc kubenswrapper[4628]: I1211 05:28:41.571164 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" Dec 11 05:28:41 crc kubenswrapper[4628]: I1211 05:28:41.642620 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tmsjn"] Dec 11 05:28:41 crc kubenswrapper[4628]: I1211 05:28:41.644340 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tmsjn" Dec 11 05:28:41 crc kubenswrapper[4628]: I1211 05:28:41.650970 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tmsjn"] Dec 11 05:28:41 crc kubenswrapper[4628]: I1211 05:28:41.803179 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce-catalog-content\") pod \"redhat-operators-tmsjn\" (UID: \"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce\") " pod="openshift-marketplace/redhat-operators-tmsjn" Dec 11 05:28:41 crc kubenswrapper[4628]: I1211 05:28:41.803406 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh88h\" (UniqueName: \"kubernetes.io/projected/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce-kube-api-access-vh88h\") pod \"redhat-operators-tmsjn\" (UID: \"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce\") " pod="openshift-marketplace/redhat-operators-tmsjn" Dec 11 05:28:41 crc kubenswrapper[4628]: I1211 05:28:41.803449 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce-utilities\") pod \"redhat-operators-tmsjn\" (UID: \"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce\") " pod="openshift-marketplace/redhat-operators-tmsjn" Dec 11 05:28:41 crc kubenswrapper[4628]: I1211 05:28:41.904422 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce-catalog-content\") pod \"redhat-operators-tmsjn\" (UID: \"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce\") " pod="openshift-marketplace/redhat-operators-tmsjn" Dec 11 05:28:41 crc kubenswrapper[4628]: I1211 05:28:41.904946 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce-catalog-content\") pod \"redhat-operators-tmsjn\" (UID: \"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce\") " pod="openshift-marketplace/redhat-operators-tmsjn" Dec 11 05:28:41 crc kubenswrapper[4628]: I1211 05:28:41.905048 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh88h\" (UniqueName: \"kubernetes.io/projected/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce-kube-api-access-vh88h\") pod \"redhat-operators-tmsjn\" (UID: \"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce\") " pod="openshift-marketplace/redhat-operators-tmsjn" Dec 11 05:28:41 crc kubenswrapper[4628]: I1211 05:28:41.905081 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce-utilities\") pod \"redhat-operators-tmsjn\" (UID: \"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce\") " pod="openshift-marketplace/redhat-operators-tmsjn" Dec 11 05:28:41 crc kubenswrapper[4628]: I1211 05:28:41.905517 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce-utilities\") pod \"redhat-operators-tmsjn\" (UID: \"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce\") " pod="openshift-marketplace/redhat-operators-tmsjn" Dec 11 05:28:41 crc kubenswrapper[4628]: I1211 05:28:41.905473 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/3112c087-1436-4f0a-8b0c-6000b07a0f77-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f9w8m4\" (UID: \"3112c087-1436-4f0a-8b0c-6000b07a0f77\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" Dec 11 05:28:41 crc kubenswrapper[4628]: I1211 05:28:41.934969 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3112c087-1436-4f0a-8b0c-6000b07a0f77-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879f9w8m4\" (UID: \"3112c087-1436-4f0a-8b0c-6000b07a0f77\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" Dec 11 05:28:41 crc kubenswrapper[4628]: I1211 05:28:41.958287 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh88h\" (UniqueName: \"kubernetes.io/projected/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce-kube-api-access-vh88h\") pod \"redhat-operators-tmsjn\" (UID: \"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce\") " pod="openshift-marketplace/redhat-operators-tmsjn" Dec 11 05:28:41 crc kubenswrapper[4628]: I1211 05:28:41.962547 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tmsjn" Dec 11 05:28:42 crc kubenswrapper[4628]: I1211 05:28:42.160325 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" Dec 11 05:28:42 crc kubenswrapper[4628]: I1211 05:28:42.411370 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:42 crc kubenswrapper[4628]: I1211 05:28:42.411488 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:42 crc kubenswrapper[4628]: E1211 05:28:42.411543 4628 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 11 05:28:42 crc kubenswrapper[4628]: E1211 05:28:42.411604 4628 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 11 05:28:42 crc kubenswrapper[4628]: E1211 05:28:42.411628 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs podName:c0ac60c7-7b87-490a-9107-ad5de9864845 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:58.411606711 +0000 UTC m=+840.828953399 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs") pod "openstack-operator-controller-manager-7546d6d447-f9qwn" (UID: "c0ac60c7-7b87-490a-9107-ad5de9864845") : secret "metrics-server-cert" not found Dec 11 05:28:42 crc kubenswrapper[4628]: E1211 05:28:42.411655 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs podName:c0ac60c7-7b87-490a-9107-ad5de9864845 nodeName:}" failed. No retries permitted until 2025-12-11 05:28:58.411637782 +0000 UTC m=+840.828984480 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs") pod "openstack-operator-controller-manager-7546d6d447-f9qwn" (UID: "c0ac60c7-7b87-490a-9107-ad5de9864845") : secret "webhook-server-cert" not found Dec 11 05:28:43 crc kubenswrapper[4628]: E1211 05:28:43.452327 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027" Dec 11 05:28:43 crc kubenswrapper[4628]: E1211 05:28:43.452537 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lmhfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-5697bb5779-v2lxt_openstack-operators(dbdd3dcf-94cf-4b1e-9918-5d8efbe60360): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:28:43 crc kubenswrapper[4628]: I1211 05:28:43.968588 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k4pfr" Dec 11 05:28:44 crc kubenswrapper[4628]: I1211 05:28:44.040481 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k4pfr" Dec 11 05:28:44 crc kubenswrapper[4628]: E1211 05:28:44.214543 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 11 05:28:44 crc kubenswrapper[4628]: E1211 05:28:44.215279 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x9dm5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-qxsdk_openstack-operators(ee29a0b0-46f9-45f6-b356-dde79504d5cc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:28:45 crc kubenswrapper[4628]: I1211 05:28:45.425826 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k4pfr"] Dec 11 05:28:45 crc kubenswrapper[4628]: I1211 05:28:45.442369 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k4pfr" podUID="cf3bfbad-c97e-47e9-9390-233f50d34f49" containerName="registry-server" containerID="cri-o://0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe" gracePeriod=2 Dec 11 05:28:46 crc kubenswrapper[4628]: I1211 05:28:46.832131 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-95cq4"] Dec 11 05:28:46 crc kubenswrapper[4628]: I1211 05:28:46.835371 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-95cq4" Dec 11 05:28:46 crc kubenswrapper[4628]: I1211 05:28:46.855400 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-95cq4"] Dec 11 05:28:47 crc kubenswrapper[4628]: I1211 05:28:47.014097 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/490133ce-06a3-43b2-8ad4-1e233d714c56-catalog-content\") pod \"community-operators-95cq4\" (UID: \"490133ce-06a3-43b2-8ad4-1e233d714c56\") " pod="openshift-marketplace/community-operators-95cq4" Dec 11 05:28:47 crc kubenswrapper[4628]: I1211 05:28:47.014161 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8ntv\" (UniqueName: \"kubernetes.io/projected/490133ce-06a3-43b2-8ad4-1e233d714c56-kube-api-access-s8ntv\") pod \"community-operators-95cq4\" (UID: \"490133ce-06a3-43b2-8ad4-1e233d714c56\") " pod="openshift-marketplace/community-operators-95cq4" Dec 11 05:28:47 crc kubenswrapper[4628]: I1211 05:28:47.014423 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/490133ce-06a3-43b2-8ad4-1e233d714c56-utilities\") pod \"community-operators-95cq4\" (UID: \"490133ce-06a3-43b2-8ad4-1e233d714c56\") " pod="openshift-marketplace/community-operators-95cq4" Dec 11 05:28:47 crc kubenswrapper[4628]: I1211 05:28:47.116225 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/490133ce-06a3-43b2-8ad4-1e233d714c56-catalog-content\") pod \"community-operators-95cq4\" (UID: \"490133ce-06a3-43b2-8ad4-1e233d714c56\") " pod="openshift-marketplace/community-operators-95cq4" Dec 11 05:28:47 crc kubenswrapper[4628]: I1211 05:28:47.116302 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8ntv\" (UniqueName: \"kubernetes.io/projected/490133ce-06a3-43b2-8ad4-1e233d714c56-kube-api-access-s8ntv\") pod \"community-operators-95cq4\" (UID: \"490133ce-06a3-43b2-8ad4-1e233d714c56\") " pod="openshift-marketplace/community-operators-95cq4" Dec 11 05:28:47 crc kubenswrapper[4628]: I1211 05:28:47.116397 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/490133ce-06a3-43b2-8ad4-1e233d714c56-utilities\") pod \"community-operators-95cq4\" (UID: \"490133ce-06a3-43b2-8ad4-1e233d714c56\") " pod="openshift-marketplace/community-operators-95cq4" Dec 11 05:28:47 crc kubenswrapper[4628]: I1211 05:28:47.116948 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/490133ce-06a3-43b2-8ad4-1e233d714c56-catalog-content\") pod \"community-operators-95cq4\" (UID: \"490133ce-06a3-43b2-8ad4-1e233d714c56\") " pod="openshift-marketplace/community-operators-95cq4" Dec 11 05:28:47 crc kubenswrapper[4628]: I1211 05:28:47.116972 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/490133ce-06a3-43b2-8ad4-1e233d714c56-utilities\") pod \"community-operators-95cq4\" (UID: \"490133ce-06a3-43b2-8ad4-1e233d714c56\") " pod="openshift-marketplace/community-operators-95cq4" Dec 11 05:28:47 crc kubenswrapper[4628]: I1211 05:28:47.133626 4628 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s8ntv\" (UniqueName: \"kubernetes.io/projected/490133ce-06a3-43b2-8ad4-1e233d714c56-kube-api-access-s8ntv\") pod \"community-operators-95cq4\" (UID: \"490133ce-06a3-43b2-8ad4-1e233d714c56\") " pod="openshift-marketplace/community-operators-95cq4" Dec 11 05:28:47 crc kubenswrapper[4628]: I1211 05:28:47.221863 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-95cq4" Dec 11 05:28:47 crc kubenswrapper[4628]: I1211 05:28:47.456818 4628 generic.go:334] "Generic (PLEG): container finished" podID="cf3bfbad-c97e-47e9-9390-233f50d34f49" containerID="0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe" exitCode=0 Dec 11 05:28:47 crc kubenswrapper[4628]: I1211 05:28:47.456891 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4pfr" event={"ID":"cf3bfbad-c97e-47e9-9390-233f50d34f49","Type":"ContainerDied","Data":"0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe"} Dec 11 05:28:48 crc kubenswrapper[4628]: E1211 05:28:48.046306 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 11 05:28:48 crc kubenswrapper[4628]: E1211 05:28:48.046483 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2525c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-9bcfl_openstack-operators(f041b1fa-37ae-46fc-b6b0-301da06c1ff7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:28:48 crc kubenswrapper[4628]: E1211 05:28:48.683004 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 11 05:28:48 crc kubenswrapper[4628]: E1211 05:28:48.683182 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pgck9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-vftnq_openstack-operators(a7d3410e-df7b-4de8-aa0f-4c6de9e251e7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:28:50 crc kubenswrapper[4628]: E1211 05:28:50.522435 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 11 05:28:50 crc kubenswrapper[4628]: E1211 05:28:50.524307 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j5ssp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-tnvqg_openstack-operators(9dc3edb0-5d7f-4b4f-bea2-5f9c25b222fe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:28:51 crc kubenswrapper[4628]: E1211 05:28:51.174618 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 11 05:28:51 crc kubenswrapper[4628]: E1211 05:28:51.175167 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xcf8s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-nc2xx_openstack-operators(bca7bee3-0202-48ba-b0e9-3353f6ab0938): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:28:52 crc kubenswrapper[4628]: E1211 05:28:52.848412 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a" Dec 11 05:28:52 crc kubenswrapper[4628]: E1211 05:28:52.849252 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sqbgx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-5b5fd79c9c-bqcdn_openstack-operators(c8063e93-9008-453c-805c-487456b5e0ac): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:28:53 crc kubenswrapper[4628]: E1211 05:28:53.930445 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe is running failed: container process not found" containerID="0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe" cmd=["grpc_health_probe","-addr=:50051"] Dec 11 05:28:53 crc kubenswrapper[4628]: E1211 05:28:53.931321 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe is running failed: container process not found" containerID="0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe" cmd=["grpc_health_probe","-addr=:50051"] Dec 11 05:28:53 crc kubenswrapper[4628]: E1211 05:28:53.931805 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe is running failed: container process not found" containerID="0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe" cmd=["grpc_health_probe","-addr=:50051"] Dec 11 05:28:53 crc kubenswrapper[4628]: E1211 05:28:53.931882 4628 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-k4pfr" podUID="cf3bfbad-c97e-47e9-9390-233f50d34f49" containerName="registry-server" Dec 11 05:28:58 crc kubenswrapper[4628]: I1211 05:28:58.413981 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:58 crc kubenswrapper[4628]: I1211 05:28:58.414528 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs\") pod 
\"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:58 crc kubenswrapper[4628]: I1211 05:28:58.430800 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-metrics-certs\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:58 crc kubenswrapper[4628]: I1211 05:28:58.713140 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c0ac60c7-7b87-490a-9107-ad5de9864845-webhook-certs\") pod \"openstack-operator-controller-manager-7546d6d447-f9qwn\" (UID: \"c0ac60c7-7b87-490a-9107-ad5de9864845\") " pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:58 crc kubenswrapper[4628]: I1211 05:28:58.885838 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-tsl56" Dec 11 05:28:58 crc kubenswrapper[4628]: I1211 05:28:58.894119 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:28:59 crc kubenswrapper[4628]: E1211 05:28:59.408767 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87" Dec 11 05:28:59 crc kubenswrapper[4628]: E1211 05:28:59.409110 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d7w66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-967d97867-x4p8r_openstack-operators(c8834adf-70c2-46a6-a5d7-bdb2ddfc91d2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:29:00 crc kubenswrapper[4628]: E1211 05:29:00.001859 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991" Dec 11 05:29:00 crc kubenswrapper[4628]: E1211 05:29:00.002432 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m7bxh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-zvjrq_openstack-operators(2b786de1-276f-470c-b60a-e93596dd9e47): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:29:00 crc kubenswrapper[4628]: E1211 05:29:00.559086 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 11 05:29:00 crc kubenswrapper[4628]: E1211 05:29:00.559286 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zplb8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-vcz8d_openstack-operators(232e8d69-426a-4259-93ab-1ebb4fa89a17): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:29:03 crc kubenswrapper[4628]: E1211 05:29:03.187935 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a" Dec 11 05:29:03 crc kubenswrapper[4628]: E1211 05:29:03.188368 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2fbhp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-75944c9b7-l2wf4_openstack-operators(53a3113c-a3d2-42c8-8ab8-b26b448a728a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:29:03 crc kubenswrapper[4628]: E1211 05:29:03.726500 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f" Dec 11 05:29:03 crc kubenswrapper[4628]: E1211 05:29:03.726718 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ggcxz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-2q4b9_openstack-operators(2b9ef50b-db17-4df4-a936-5a02a25f61d7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:29:03 crc kubenswrapper[4628]: E1211 05:29:03.929816 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe is running failed: container process not found" containerID="0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe" cmd=["grpc_health_probe","-addr=:50051"] Dec 11 05:29:03 crc kubenswrapper[4628]: E1211 05:29:03.930435 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe is running failed: container process not found" containerID="0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe" cmd=["grpc_health_probe","-addr=:50051"] Dec 11 05:29:03 crc kubenswrapper[4628]: E1211 05:29:03.930715 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe is running failed: container process not found" containerID="0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe" cmd=["grpc_health_probe","-addr=:50051"] Dec 11 05:29:03 crc kubenswrapper[4628]: E1211 05:29:03.930774 4628 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-k4pfr" podUID="cf3bfbad-c97e-47e9-9390-233f50d34f49" containerName="registry-server" Dec 11 05:29:04 crc kubenswrapper[4628]: E1211 05:29:04.475360 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 11 05:29:04 crc kubenswrapper[4628]: E1211 05:29:04.475907 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vn2kw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-z6dn7_openstack-operators(d0e69cfa-5f08-4640-b9f8-b7c27ef8660f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:29:04 crc kubenswrapper[4628]: E1211 05:29:04.918090 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 11 05:29:04 crc kubenswrapper[4628]: E1211 05:29:04.918283 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 
200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5zkpm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-l85kc_openstack-operators(938faeea-3048-4d4a-8f3d-e22b31c73f47): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:29:04 crc kubenswrapper[4628]: E1211 05:29:04.919622 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l85kc" podUID="938faeea-3048-4d4a-8f3d-e22b31c73f47" Dec 11 05:29:05 crc kubenswrapper[4628]: E1211 05:29:05.566872 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l85kc" podUID="938faeea-3048-4d4a-8f3d-e22b31c73f47" Dec 11 05:29:07 crc kubenswrapper[4628]: I1211 05:29:07.474420 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4pfr" Dec 11 05:29:07 crc kubenswrapper[4628]: I1211 05:29:07.489379 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxwzx\" (UniqueName: \"kubernetes.io/projected/cf3bfbad-c97e-47e9-9390-233f50d34f49-kube-api-access-sxwzx\") pod \"cf3bfbad-c97e-47e9-9390-233f50d34f49\" (UID: \"cf3bfbad-c97e-47e9-9390-233f50d34f49\") " Dec 11 05:29:07 crc kubenswrapper[4628]: I1211 05:29:07.489491 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf3bfbad-c97e-47e9-9390-233f50d34f49-utilities\") pod \"cf3bfbad-c97e-47e9-9390-233f50d34f49\" (UID: \"cf3bfbad-c97e-47e9-9390-233f50d34f49\") " Dec 11 05:29:07 crc kubenswrapper[4628]: I1211 05:29:07.489542 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf3bfbad-c97e-47e9-9390-233f50d34f49-catalog-content\") pod \"cf3bfbad-c97e-47e9-9390-233f50d34f49\" (UID: \"cf3bfbad-c97e-47e9-9390-233f50d34f49\") " Dec 11 05:29:07 crc kubenswrapper[4628]: I1211 05:29:07.490800 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf3bfbad-c97e-47e9-9390-233f50d34f49-utilities" (OuterVolumeSpecName: "utilities") pod "cf3bfbad-c97e-47e9-9390-233f50d34f49" (UID: "cf3bfbad-c97e-47e9-9390-233f50d34f49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:29:07 crc kubenswrapper[4628]: I1211 05:29:07.493209 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf3bfbad-c97e-47e9-9390-233f50d34f49-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:29:07 crc kubenswrapper[4628]: I1211 05:29:07.513103 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf3bfbad-c97e-47e9-9390-233f50d34f49-kube-api-access-sxwzx" (OuterVolumeSpecName: "kube-api-access-sxwzx") pod "cf3bfbad-c97e-47e9-9390-233f50d34f49" (UID: "cf3bfbad-c97e-47e9-9390-233f50d34f49"). InnerVolumeSpecName "kube-api-access-sxwzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:29:07 crc kubenswrapper[4628]: I1211 05:29:07.551130 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf3bfbad-c97e-47e9-9390-233f50d34f49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf3bfbad-c97e-47e9-9390-233f50d34f49" (UID: "cf3bfbad-c97e-47e9-9390-233f50d34f49"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:29:07 crc kubenswrapper[4628]: I1211 05:29:07.594928 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxwzx\" (UniqueName: \"kubernetes.io/projected/cf3bfbad-c97e-47e9-9390-233f50d34f49-kube-api-access-sxwzx\") on node \"crc\" DevicePath \"\"" Dec 11 05:29:07 crc kubenswrapper[4628]: I1211 05:29:07.595273 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf3bfbad-c97e-47e9-9390-233f50d34f49-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:29:07 crc kubenswrapper[4628]: I1211 05:29:07.595413 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4pfr" event={"ID":"cf3bfbad-c97e-47e9-9390-233f50d34f49","Type":"ContainerDied","Data":"71a1d28479e6fbe46d7aa6ad67ea5a5b40b9173e3985cb693f0e1f5892bd2f0d"} Dec 11 05:29:07 crc kubenswrapper[4628]: I1211 05:29:07.595463 4628 scope.go:117] "RemoveContainer" containerID="0734e33f4bf5c0a6aab5e78f04eea70f02f7055bb8344726fae01574c125f1fe" Dec 11 05:29:07 crc kubenswrapper[4628]: I1211 05:29:07.595610 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k4pfr" Dec 11 05:29:07 crc kubenswrapper[4628]: E1211 05:29:07.618286 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 11 05:29:07 crc kubenswrapper[4628]: E1211 05:29:07.618433 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r6vqz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-w5xrs_openstack-operators(d92dcd20-90f9-4499-bae5-f117cf41b4d5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:29:07 crc kubenswrapper[4628]: I1211 05:29:07.653016 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k4pfr"] Dec 11 05:29:07 crc kubenswrapper[4628]: I1211 05:29:07.658217 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k4pfr"] Dec 11 05:29:07 crc kubenswrapper[4628]: I1211 05:29:07.912764 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf3bfbad-c97e-47e9-9390-233f50d34f49" path="/var/lib/kubelet/pods/cf3bfbad-c97e-47e9-9390-233f50d34f49/volumes" Dec 11 05:29:07 crc kubenswrapper[4628]: I1211 05:29:07.918304 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tmsjn"] Dec 11 05:29:08 crc kubenswrapper[4628]: I1211 05:29:08.131240 4628 scope.go:117] "RemoveContainer" containerID="2ae8a74bfd2725209a548b5cbe768d8faada425ae4632440095fadc51a8c100f" Dec 11 05:29:08 crc kubenswrapper[4628]: I1211 05:29:08.132645 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4"] Dec 11 05:29:08 crc kubenswrapper[4628]: I1211 05:29:08.407897 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94"] Dec 11 05:29:08 crc kubenswrapper[4628]: I1211 05:29:08.481286 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-95cq4"] Dec 11 05:29:08 crc kubenswrapper[4628]: I1211 05:29:08.597524 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn"] Dec 11 05:29:08 crc kubenswrapper[4628]: I1211 05:29:08.616128 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-h5xhk" event={"ID":"43de67af-1cf5-4412-833e-e95e2ffcc47b","Type":"ContainerStarted","Data":"d92c117b692393c6294919d133393d8660e94a3d40f0fd7740ebeb666dda4d07"} Dec 11 05:29:08 crc kubenswrapper[4628]: I1211 05:29:08.618954 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9d9wj" event={"ID":"f7d58419-0988-4a35-800f-2298db8e6597","Type":"ContainerStarted","Data":"68722e9f0be6469fb2665262b7f84a2825a044fb974ad6985dae44162b25af72"} Dec 11 05:29:08 crc kubenswrapper[4628]: I1211 05:29:08.621510 4628 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-tmsjn" event={"ID":"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce","Type":"ContainerStarted","Data":"e2a960a38a48354c53eb8e6bc01470dcf889b5d6b04501c2a30a47794070b294"} Dec 11 05:29:08 crc kubenswrapper[4628]: W1211 05:29:08.734103 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod490133ce_06a3_43b2_8ad4_1e233d714c56.slice/crio-37457f348c20be4962932a279e06c61cee5eb5114343633951792640314d3f31 WatchSource:0}: Error finding container 37457f348c20be4962932a279e06c61cee5eb5114343633951792640314d3f31: Status 404 returned error can't find the container with id 37457f348c20be4962932a279e06c61cee5eb5114343633951792640314d3f31 Dec 11 05:29:08 crc kubenswrapper[4628]: W1211 05:29:08.737895 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae8e31fb_df50_4c43_af56_9c01af34f181.slice/crio-93e24c7e083a5068c89e8be6fbb342492df8211530cb0a9b33a8e48ca68ac829 WatchSource:0}: Error finding container 93e24c7e083a5068c89e8be6fbb342492df8211530cb0a9b33a8e48ca68ac829: Status 404 returned error can't find the container with id 93e24c7e083a5068c89e8be6fbb342492df8211530cb0a9b33a8e48ca68ac829 Dec 11 05:29:08 crc kubenswrapper[4628]: I1211 05:29:08.872136 4628 scope.go:117] "RemoveContainer" containerID="b7900c1af0c490534b2e4a8a1de0bad9c8eb5a500484d02b782e03d0d703c559" Dec 11 05:29:09 crc kubenswrapper[4628]: I1211 05:29:09.628693 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cjb98" event={"ID":"c2e9f8e4-3eda-4227-ad4a-8f8641f88612","Type":"ContainerStarted","Data":"1707205a70a5a5bb9f830215780a414ffd7abbe83271cee6255951368480daa3"} Dec 11 05:29:09 crc kubenswrapper[4628]: I1211 05:29:09.632241 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" event={"ID":"3112c087-1436-4f0a-8b0c-6000b07a0f77","Type":"ContainerStarted","Data":"b6c2db0918d0ef1cdfba8539c7fc761607527d10a03fe17f60d428c6cb150ca8"} Dec 11 05:29:09 crc kubenswrapper[4628]: I1211 05:29:09.633732 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" event={"ID":"c0ac60c7-7b87-490a-9107-ad5de9864845","Type":"ContainerStarted","Data":"6033de41a6799a86f88c12b3579a2702314f9e0b7556a0d5ad414a80f81a4c02"} Dec 11 05:29:09 crc kubenswrapper[4628]: I1211 05:29:09.636100 4628 generic.go:334] "Generic (PLEG): container finished" podID="d0150af6-b59b-4a9f-89da-1d6cdcd79a44" containerID="c380ef520a31624689a27a1ecd0902d602da6b78c6e241a29f681dda2bfe8167" exitCode=0 Dec 11 05:29:09 crc kubenswrapper[4628]: I1211 05:29:09.636146 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2nftm" event={"ID":"d0150af6-b59b-4a9f-89da-1d6cdcd79a44","Type":"ContainerDied","Data":"c380ef520a31624689a27a1ecd0902d602da6b78c6e241a29f681dda2bfe8167"} Dec 11 05:29:09 crc kubenswrapper[4628]: I1211 05:29:09.638482 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" event={"ID":"ae8e31fb-df50-4c43-af56-9c01af34f181","Type":"ContainerStarted","Data":"93e24c7e083a5068c89e8be6fbb342492df8211530cb0a9b33a8e48ca68ac829"} Dec 11 05:29:09 crc kubenswrapper[4628]: I1211 05:29:09.640563 4628 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jqz2j" event={"ID":"88d0bbcc-5138-434d-811b-d8db056922cb","Type":"ContainerStarted","Data":"8e7cc862c2cc614d22f365c2b633ccaa38d748068fb6a00f802b188b4c09a300"} Dec 11 05:29:09 crc kubenswrapper[4628]: I1211 05:29:09.641736 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95cq4" event={"ID":"490133ce-06a3-43b2-8ad4-1e233d714c56","Type":"ContainerStarted","Data":"37457f348c20be4962932a279e06c61cee5eb5114343633951792640314d3f31"} Dec 11 05:29:09 crc kubenswrapper[4628]: I1211 05:29:09.644866 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-whlx7" event={"ID":"2f46589d-ec5b-48e9-8f64-741a6a5b3e84","Type":"ContainerStarted","Data":"6e23e1b53a196459f67c2e3770467d8765d8642da05a683bddce436faaa30540"} Dec 11 05:29:13 crc kubenswrapper[4628]: I1211 05:29:13.706871 4628 generic.go:334] "Generic (PLEG): container finished" podID="490133ce-06a3-43b2-8ad4-1e233d714c56" containerID="dbbb24fe82848cebb2f5069273ce82a0e6008b0a8bc6f4798fa38ab15be7ad95" exitCode=0 Dec 11 05:29:13 crc kubenswrapper[4628]: I1211 05:29:13.707163 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95cq4" event={"ID":"490133ce-06a3-43b2-8ad4-1e233d714c56","Type":"ContainerDied","Data":"dbbb24fe82848cebb2f5069273ce82a0e6008b0a8bc6f4798fa38ab15be7ad95"} Dec 11 05:29:13 crc kubenswrapper[4628]: I1211 05:29:13.716129 4628 generic.go:334] "Generic (PLEG): container finished" podID="68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce" containerID="9c2cdcf43a8ec7bdfb079ae80f8215df8427eb126a8633beea12f67b2c0af338" exitCode=0 Dec 11 05:29:13 crc kubenswrapper[4628]: I1211 05:29:13.716167 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmsjn" event={"ID":"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce","Type":"ContainerDied","Data":"9c2cdcf43a8ec7bdfb079ae80f8215df8427eb126a8633beea12f67b2c0af338"} Dec 11 05:29:14 crc kubenswrapper[4628]: I1211 05:29:14.062187 4628 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 05:29:14 crc kubenswrapper[4628]: E1211 05:29:14.078171 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 11 05:29:14 crc kubenswrapper[4628]: E1211 05:29:14.078375 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lmhfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-5697bb5779-v2lxt_openstack-operators(dbdd3dcf-94cf-4b1e-9918-5d8efbe60360): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:29:14 crc kubenswrapper[4628]: E1211 05:29:14.080694 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-v2lxt" podUID="dbdd3dcf-94cf-4b1e-9918-5d8efbe60360" Dec 11 05:29:14 crc kubenswrapper[4628]: E1211 05:29:14.490391 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qxsdk" podUID="ee29a0b0-46f9-45f6-b356-dde79504d5cc" Dec 11 05:29:14 crc kubenswrapper[4628]: E1211 05:29:14.547527 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nc2xx" podUID="bca7bee3-0202-48ba-b0e9-3353f6ab0938" Dec 11 05:29:14 crc kubenswrapper[4628]: E1211 05:29:14.563785 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-tnvqg" podUID="9dc3edb0-5d7f-4b4f-bea2-5f9c25b222fe" Dec 11 05:29:14 crc kubenswrapper[4628]: E1211 05:29:14.584191 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-bqcdn" podUID="c8063e93-9008-453c-805c-487456b5e0ac" Dec 11 05:29:14 crc kubenswrapper[4628]: E1211 05:29:14.600933 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9" podUID="2b9ef50b-db17-4df4-a936-5a02a25f61d7" Dec 
11 05:29:14 crc kubenswrapper[4628]: I1211 05:29:14.729317 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nc2xx" event={"ID":"bca7bee3-0202-48ba-b0e9-3353f6ab0938","Type":"ContainerStarted","Data":"1e704f97a625de2bf4eba5287d84ff749d99e37cf2d9fb52c8ab931ba37fc570"} Dec 11 05:29:14 crc kubenswrapper[4628]: I1211 05:29:14.733661 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9" event={"ID":"2b9ef50b-db17-4df4-a936-5a02a25f61d7","Type":"ContainerStarted","Data":"8676bfa35f41faba3c2d8e5b8a58ae04a6f306cd802f8845036c67571d79f885"} Dec 11 05:29:14 crc kubenswrapper[4628]: E1211 05:29:14.738981 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9" podUID="2b9ef50b-db17-4df4-a936-5a02a25f61d7" Dec 11 05:29:14 crc kubenswrapper[4628]: I1211 05:29:14.744175 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-bqcdn" event={"ID":"c8063e93-9008-453c-805c-487456b5e0ac","Type":"ContainerStarted","Data":"429763b6c10c5d6d9c44a3864c31930df53ab081c373d2e165b022d0403a0a32"} Dec 11 05:29:14 crc kubenswrapper[4628]: I1211 05:29:14.755484 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" event={"ID":"c0ac60c7-7b87-490a-9107-ad5de9864845","Type":"ContainerStarted","Data":"bcbaef0b1546bc01f436dd6b159d13ca9d128fca0bb6b8a6678ba83513fef3ad"} Dec 11 05:29:14 crc kubenswrapper[4628]: I1211 05:29:14.769694 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:29:14 crc kubenswrapper[4628]: I1211 05:29:14.820087 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2nftm" event={"ID":"d0150af6-b59b-4a9f-89da-1d6cdcd79a44","Type":"ContainerStarted","Data":"ce8c70e0e1f1d3c8dbfa81d9ef754a59f86370c8ee4b8d858ca39a094dc80b8f"} Dec 11 05:29:14 crc kubenswrapper[4628]: I1211 05:29:14.821926 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qxsdk" event={"ID":"ee29a0b0-46f9-45f6-b356-dde79504d5cc","Type":"ContainerStarted","Data":"d4883caad9a54fedc675aefda301aac8f9840b6d6a1ce6c41e2c15261d63a8b2"} Dec 11 05:29:14 crc kubenswrapper[4628]: I1211 05:29:14.823973 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-tnvqg" event={"ID":"9dc3edb0-5d7f-4b4f-bea2-5f9c25b222fe","Type":"ContainerStarted","Data":"bd1ef8bfd9a932870667d69f4b0c762e9e1963ba21263f1517723be67947a1ab"} Dec 11 05:29:14 crc kubenswrapper[4628]: E1211 05:29:14.836696 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4" podUID="53a3113c-a3d2-42c8-8ab8-b26b448a728a" Dec 11 05:29:14 crc kubenswrapper[4628]: E1211 05:29:14.861984 4628 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-z6dn7" podUID="d0e69cfa-5f08-4640-b9f8-b7c27ef8660f" Dec 11 05:29:14 crc kubenswrapper[4628]: I1211 05:29:14.869604 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" podStartSLOduration=48.869585459 podStartE2EDuration="48.869585459s" podCreationTimestamp="2025-12-11 05:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:29:14.831564532 +0000 UTC m=+857.248911230" watchObservedRunningTime="2025-12-11 05:29:14.869585459 +0000 UTC m=+857.286932157" Dec 11 05:29:14 crc kubenswrapper[4628]: I1211 05:29:14.891234 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2nftm" podStartSLOduration=7.8698906619999995 podStartE2EDuration="44.89121918s" podCreationTimestamp="2025-12-11 05:28:30 +0000 UTC" firstStartedPulling="2025-12-11 05:28:37.067737051 +0000 UTC m=+819.485083749" lastFinishedPulling="2025-12-11 05:29:14.089065569 +0000 UTC m=+856.506412267" observedRunningTime="2025-12-11 05:29:14.887878938 +0000 UTC m=+857.305225636" watchObservedRunningTime="2025-12-11 05:29:14.89121918 +0000 UTC m=+857.308565868" Dec 11 05:29:14 crc kubenswrapper[4628]: E1211 05:29:14.984970 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-vftnq" podUID="a7d3410e-df7b-4de8-aa0f-4c6de9e251e7" Dec 11 05:29:15 crc kubenswrapper[4628]: E1211 05:29:15.110891 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d" podUID="232e8d69-426a-4259-93ab-1ebb4fa89a17" Dec 11 05:29:15 crc kubenswrapper[4628]: E1211 05:29:15.455239 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9bcfl" podUID="f041b1fa-37ae-46fc-b6b0-301da06c1ff7" Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.873167 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-h5xhk" event={"ID":"43de67af-1cf5-4412-833e-e95e2ffcc47b","Type":"ContainerStarted","Data":"291982d01d0eb1fadb9b04a372fc9b245ec4c6086c55cf477eff780fc1e5bd8a"} Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.876091 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-h5xhk" Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.878791 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-h5xhk" Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.879597 4628 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nc2xx" Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.884086 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmsjn" event={"ID":"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce","Type":"ContainerStarted","Data":"f18ba36bbaed416e5f4c55f3e23b99bebe604dafd70ea7cb9fdeeceb4c98c86f"} Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.885820 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-bqcdn" event={"ID":"c8063e93-9008-453c-805c-487456b5e0ac","Type":"ContainerStarted","Data":"527cf983c70f7a576a3067bdcd808d91fc9ed796c449566058d14a3832a9b53f"} Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.886205 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-bqcdn" Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.892712 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-h5xhk" podStartSLOduration=2.952249433 podStartE2EDuration="50.892695277s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:28:26.943425655 +0000 UTC m=+809.360772353" lastFinishedPulling="2025-12-11 05:29:14.883871499 +0000 UTC m=+857.301218197" observedRunningTime="2025-12-11 05:29:15.887012272 +0000 UTC m=+858.304358970" watchObservedRunningTime="2025-12-11 05:29:15.892695277 +0000 UTC m=+858.310041975" Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.914913 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-vftnq" event={"ID":"a7d3410e-df7b-4de8-aa0f-4c6de9e251e7","Type":"ContainerStarted","Data":"b0df431f1e058ed99f253c3d690ef904b5f9ef1f2276e930766ec5e059f67d0b"} Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.914948 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jqz2j" event={"ID":"88d0bbcc-5138-434d-811b-d8db056922cb","Type":"ContainerStarted","Data":"97b503faa88ca91e56ee61c8830ba37cbd224067250e37ccf254fe03ba34e9c1"} Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.914965 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jqz2j" Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.915004 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jqz2j" Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.925996 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d" event={"ID":"232e8d69-426a-4259-93ab-1ebb4fa89a17","Type":"ContainerStarted","Data":"44302a079c58152ada35241c35a84092f662d29bf9ff81bf9c4803c6a02f9d01"} Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.931103 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9bcfl" event={"ID":"f041b1fa-37ae-46fc-b6b0-301da06c1ff7","Type":"ContainerStarted","Data":"8449f6930ef1df566945656a5bd359656a04513556dbaf52bf64550646bf1e0a"} Dec 11 05:29:15 crc kubenswrapper[4628]: E1211 05:29:15.931658 
4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d" podUID="232e8d69-426a-4259-93ab-1ebb4fa89a17" Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.937005 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9d9wj" event={"ID":"f7d58419-0988-4a35-800f-2298db8e6597","Type":"ContainerStarted","Data":"5481fb9c3b2bb1778f31b14f622ae5256551532389f48f11da18ed567eaebc04"} Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.938625 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9d9wj" Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.941483 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9d9wj" Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.941786 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nc2xx" podStartSLOduration=3.305892686 podStartE2EDuration="50.941765756s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:28:27.922356954 +0000 UTC m=+810.339703652" lastFinishedPulling="2025-12-11 05:29:15.558230034 +0000 UTC m=+857.975576722" observedRunningTime="2025-12-11 05:29:15.933507771 +0000 UTC m=+858.350854469" watchObservedRunningTime="2025-12-11 05:29:15.941765756 +0000 UTC m=+858.359112454" Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.946757 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-z6dn7" event={"ID":"d0e69cfa-5f08-4640-b9f8-b7c27ef8660f","Type":"ContainerStarted","Data":"5ada14e0560c4404e404e4b49cfab283d7a0dd97d86569731f4f46fba8dd0cf1"} Dec 11 05:29:15 crc kubenswrapper[4628]: I1211 05:29:15.970675 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4" event={"ID":"53a3113c-a3d2-42c8-8ab8-b26b448a728a","Type":"ContainerStarted","Data":"f224af784fd4e37a2210d9cd59bbe3310ac1004291df87bdc0a15c892a6c4a0b"} Dec 11 05:29:16 crc kubenswrapper[4628]: I1211 05:29:16.004270 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-bqcdn" podStartSLOduration=2.843632466 podStartE2EDuration="51.00425552s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:28:27.248701697 +0000 UTC m=+809.666048395" lastFinishedPulling="2025-12-11 05:29:15.409324751 +0000 UTC m=+857.826671449" observedRunningTime="2025-12-11 05:29:15.990792733 +0000 UTC m=+858.408139441" watchObservedRunningTime="2025-12-11 05:29:16.00425552 +0000 UTC m=+858.421602218" Dec 11 05:29:16 crc kubenswrapper[4628]: I1211 05:29:16.028446 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9d9wj" podStartSLOduration=2.889165801 podStartE2EDuration="51.028430599s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" 
firstStartedPulling="2025-12-11 05:28:26.943072785 +0000 UTC m=+809.360419483" lastFinishedPulling="2025-12-11 05:29:15.082337583 +0000 UTC m=+857.499684281" observedRunningTime="2025-12-11 05:29:16.023974468 +0000 UTC m=+858.441321166" watchObservedRunningTime="2025-12-11 05:29:16.028430599 +0000 UTC m=+858.445777297" Dec 11 05:29:16 crc kubenswrapper[4628]: I1211 05:29:16.108523 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-jqz2j" podStartSLOduration=3.753939613 podStartE2EDuration="51.108506104s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:28:27.676351149 +0000 UTC m=+810.093697847" lastFinishedPulling="2025-12-11 05:29:15.03091764 +0000 UTC m=+857.448264338" observedRunningTime="2025-12-11 05:29:16.078732731 +0000 UTC m=+858.496079429" watchObservedRunningTime="2025-12-11 05:29:16.108506104 +0000 UTC m=+858.525852802" Dec 11 05:29:16 crc kubenswrapper[4628]: E1211 05:29:16.377226 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4" podUID="53a3113c-a3d2-42c8-8ab8-b26b448a728a" Dec 11 05:29:16 crc kubenswrapper[4628]: I1211 05:29:16.989819 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nc2xx" event={"ID":"bca7bee3-0202-48ba-b0e9-3353f6ab0938","Type":"ContainerStarted","Data":"bfef989a6a5cb3f490b8ec8080caf70dcb58307c3a1e6b99a8057e07cfa3e71a"} Dec 11 05:29:17 crc kubenswrapper[4628]: I1211 05:29:17.017807 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-v2lxt" event={"ID":"dbdd3dcf-94cf-4b1e-9918-5d8efbe60360","Type":"ContainerStarted","Data":"1d169d9efc67e74f7542d59bd5c0a4ee84a388eee17400103d3cd130268fe4e4"} Dec 11 05:29:17 crc kubenswrapper[4628]: I1211 05:29:17.028624 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-whlx7" event={"ID":"2f46589d-ec5b-48e9-8f64-741a6a5b3e84","Type":"ContainerStarted","Data":"07a4ad0b7a652a2d06a68d34f5ca906c0a54792eb435c4166bbc11ef8d9d2021"} Dec 11 05:29:17 crc kubenswrapper[4628]: I1211 05:29:17.029374 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-whlx7" Dec 11 05:29:17 crc kubenswrapper[4628]: I1211 05:29:17.032301 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-whlx7" Dec 11 05:29:17 crc kubenswrapper[4628]: I1211 05:29:17.033654 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95cq4" event={"ID":"490133ce-06a3-43b2-8ad4-1e233d714c56","Type":"ContainerStarted","Data":"f6aca1759bd81c0af66c99e10c23a322c19efa7d441ca86b9996ad57f6a56ed4"} Dec 11 05:29:17 crc kubenswrapper[4628]: I1211 05:29:17.039537 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-tnvqg" 
event={"ID":"9dc3edb0-5d7f-4b4f-bea2-5f9c25b222fe","Type":"ContainerStarted","Data":"7602ca0b1c2e3ed93f482f4e29840ca26171034401b8c9c82ab5114713324c76"} Dec 11 05:29:17 crc kubenswrapper[4628]: I1211 05:29:17.039812 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-tnvqg" Dec 11 05:29:17 crc kubenswrapper[4628]: I1211 05:29:17.042200 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cjb98" event={"ID":"c2e9f8e4-3eda-4227-ad4a-8f8641f88612","Type":"ContainerStarted","Data":"750ff36035a132e61ea0ff068a474507702bbb2f53a7652b50234a8630c456e2"} Dec 11 05:29:17 crc kubenswrapper[4628]: I1211 05:29:17.043800 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cjb98" Dec 11 05:29:17 crc kubenswrapper[4628]: I1211 05:29:17.047288 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cjb98" Dec 11 05:29:17 crc kubenswrapper[4628]: I1211 05:29:17.050903 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-whlx7" podStartSLOduration=3.502897709 podStartE2EDuration="52.050894459s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:28:26.727971534 +0000 UTC m=+809.145318222" lastFinishedPulling="2025-12-11 05:29:15.275968274 +0000 UTC m=+857.693314972" observedRunningTime="2025-12-11 05:29:17.048187435 +0000 UTC m=+859.465534133" watchObservedRunningTime="2025-12-11 05:29:17.050894459 +0000 UTC m=+859.468241157" Dec 11 05:29:17 crc kubenswrapper[4628]: I1211 05:29:17.082524 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-cjb98" podStartSLOduration=4.090489646 podStartE2EDuration="52.082506632s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:28:27.538657986 +0000 UTC m=+809.956004674" lastFinishedPulling="2025-12-11 05:29:15.530674962 +0000 UTC m=+857.948021660" observedRunningTime="2025-12-11 05:29:17.079623693 +0000 UTC m=+859.496970391" watchObservedRunningTime="2025-12-11 05:29:17.082506632 +0000 UTC m=+859.499853330" Dec 11 05:29:17 crc kubenswrapper[4628]: E1211 05:29:17.102929 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq" podUID="2b786de1-276f-470c-b60a-e93596dd9e47" Dec 11 05:29:17 crc kubenswrapper[4628]: E1211 05:29:17.106026 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5xrs" podUID="d92dcd20-90f9-4499-bae5-f117cf41b4d5" Dec 11 05:29:17 crc kubenswrapper[4628]: I1211 05:29:17.131576 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-tnvqg" podStartSLOduration=4.732591506 podStartE2EDuration="52.131556589s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" 
firstStartedPulling="2025-12-11 05:28:28.167294139 +0000 UTC m=+810.584640837" lastFinishedPulling="2025-12-11 05:29:15.566259222 +0000 UTC m=+857.983605920" observedRunningTime="2025-12-11 05:29:17.128291161 +0000 UTC m=+859.545637859" watchObservedRunningTime="2025-12-11 05:29:17.131556589 +0000 UTC m=+859.548903287" Dec 11 05:29:18 crc kubenswrapper[4628]: I1211 05:29:18.050423 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq" event={"ID":"2b786de1-276f-470c-b60a-e93596dd9e47","Type":"ContainerStarted","Data":"1f7e185eda7c5114796e2bd806b771574a8c178f6d8c826753c7ee09252f8cb4"} Dec 11 05:29:18 crc kubenswrapper[4628]: I1211 05:29:18.053101 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-v2lxt" event={"ID":"dbdd3dcf-94cf-4b1e-9918-5d8efbe60360","Type":"ContainerStarted","Data":"b61918ff18570fc1d2dd1c51f8246cd68b7b3fb97dc0f011c6a742f6529bbf99"} Dec 11 05:29:18 crc kubenswrapper[4628]: I1211 05:29:18.053336 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-v2lxt" Dec 11 05:29:18 crc kubenswrapper[4628]: E1211 05:29:18.053995 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\"" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq" podUID="2b786de1-276f-470c-b60a-e93596dd9e47" Dec 11 05:29:18 crc kubenswrapper[4628]: I1211 05:29:18.069162 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5xrs" event={"ID":"d92dcd20-90f9-4499-bae5-f117cf41b4d5","Type":"ContainerStarted","Data":"b9c06d6b973d3c148783db4ff2916dfb61c837662b957b8cfae446af28ebfe09"} Dec 11 05:29:18 crc kubenswrapper[4628]: I1211 05:29:18.082190 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-v2lxt" podStartSLOduration=5.31465284 podStartE2EDuration="53.08214864s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:28:27.989147915 +0000 UTC m=+810.406494613" lastFinishedPulling="2025-12-11 05:29:15.756643715 +0000 UTC m=+858.173990413" observedRunningTime="2025-12-11 05:29:18.077191185 +0000 UTC m=+860.494537883" watchObservedRunningTime="2025-12-11 05:29:18.08214864 +0000 UTC m=+860.499495348" Dec 11 05:29:19 crc kubenswrapper[4628]: I1211 05:29:19.077475 4628 generic.go:334] "Generic (PLEG): container finished" podID="490133ce-06a3-43b2-8ad4-1e233d714c56" containerID="f6aca1759bd81c0af66c99e10c23a322c19efa7d441ca86b9996ad57f6a56ed4" exitCode=0 Dec 11 05:29:19 crc kubenswrapper[4628]: I1211 05:29:19.077546 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95cq4" event={"ID":"490133ce-06a3-43b2-8ad4-1e233d714c56","Type":"ContainerDied","Data":"f6aca1759bd81c0af66c99e10c23a322c19efa7d441ca86b9996ad57f6a56ed4"} Dec 11 05:29:19 crc kubenswrapper[4628]: I1211 05:29:19.079609 4628 generic.go:334] "Generic (PLEG): container finished" podID="68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce" containerID="f18ba36bbaed416e5f4c55f3e23b99bebe604dafd70ea7cb9fdeeceb4c98c86f" exitCode=0 Dec 11 05:29:19 crc 
kubenswrapper[4628]: I1211 05:29:19.079701 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmsjn" event={"ID":"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce","Type":"ContainerDied","Data":"f18ba36bbaed416e5f4c55f3e23b99bebe604dafd70ea7cb9fdeeceb4c98c86f"} Dec 11 05:29:21 crc kubenswrapper[4628]: I1211 05:29:21.411977 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2nftm" Dec 11 05:29:21 crc kubenswrapper[4628]: I1211 05:29:21.412281 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2nftm" Dec 11 05:29:22 crc kubenswrapper[4628]: I1211 05:29:22.452142 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-2nftm" podUID="d0150af6-b59b-4a9f-89da-1d6cdcd79a44" containerName="registry-server" probeResult="failure" output=< Dec 11 05:29:22 crc kubenswrapper[4628]: timeout: failed to connect service ":50051" within 1s Dec 11 05:29:22 crc kubenswrapper[4628]: > Dec 11 05:29:22 crc kubenswrapper[4628]: E1211 05:29:22.886240 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-967d97867-x4p8r" podUID="c8834adf-70c2-46a6-a5d7-bdb2ddfc91d2" Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.111537 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-z6dn7" event={"ID":"d0e69cfa-5f08-4640-b9f8-b7c27ef8660f","Type":"ContainerStarted","Data":"afbd4fd634102752be740c20529939d1608ac745c586d5328f128565370445b1"} Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.111635 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-z6dn7" Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.112796 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l85kc" event={"ID":"938faeea-3048-4d4a-8f3d-e22b31c73f47","Type":"ContainerStarted","Data":"269039b51e704a519e2bc42ce793ef12aff38c3245cee53a98d8f1269c7dfd09"} Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.114253 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" event={"ID":"ae8e31fb-df50-4c43-af56-9c01af34f181","Type":"ContainerStarted","Data":"39cdc77df43bae899d958ac20e25bdaa01f4103f73639ff8fac7a39d8395f1c4"} Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.114280 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" event={"ID":"ae8e31fb-df50-4c43-af56-9c01af34f181","Type":"ContainerStarted","Data":"bb320ba5da4e7391365e9156213d371bbddf6b19b6911323094838577235fa51"} Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.114350 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.115990 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95cq4" 
event={"ID":"490133ce-06a3-43b2-8ad4-1e233d714c56","Type":"ContainerStarted","Data":"a5ffb80b34954fc9c0df62a5c0b924d08378983134aacb315955cf473bff888a"} Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.117672 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5xrs" event={"ID":"d92dcd20-90f9-4499-bae5-f117cf41b4d5","Type":"ContainerStarted","Data":"1d4395d40092aa3e1860c442b23875480cd93f237e49c8ade711bfeeaddb9626"} Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.117879 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5xrs" Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.119055 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9bcfl" event={"ID":"f041b1fa-37ae-46fc-b6b0-301da06c1ff7","Type":"ContainerStarted","Data":"ec528d54fddc41138f2a0aaaeaba67154b9856daa68681cea81fede21e2f0ed3"} Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.119193 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9bcfl" Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.120605 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qxsdk" event={"ID":"ee29a0b0-46f9-45f6-b356-dde79504d5cc","Type":"ContainerStarted","Data":"7b0fac50a9fbd87afcbba17388b2553a33879ef228b51dca0ec9d465b9536baf"} Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.121558 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-x4p8r" event={"ID":"c8834adf-70c2-46a6-a5d7-bdb2ddfc91d2","Type":"ContainerStarted","Data":"7d6aebe26c5ad01f5063344880990d877548d405d20cad31f580a377a615bab9"} Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.123980 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmsjn" event={"ID":"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce","Type":"ContainerStarted","Data":"cc75887c54704cec47a5cd007670d546e02506eddfce1e3556b2e15e94dca59c"} Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.125746 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-vftnq" event={"ID":"a7d3410e-df7b-4de8-aa0f-4c6de9e251e7","Type":"ContainerStarted","Data":"b7288463c7812c817a2572fd6af4ba456fd13a159a235eafacdd52feb481dc3f"} Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.126126 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-vftnq" Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.127734 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" event={"ID":"3112c087-1436-4f0a-8b0c-6000b07a0f77","Type":"ContainerStarted","Data":"d2311042cb06a8127ab7778d6a842db911367bf458b84c2b0b2a75eca32909be"} Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.127768 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" 
event={"ID":"3112c087-1436-4f0a-8b0c-6000b07a0f77","Type":"ContainerStarted","Data":"66eb854894689738e9b161bc5975b668c4f4e3cd5b402b8ed1ee26d675e134d1"} Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.127873 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.147283 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-z6dn7" podStartSLOduration=3.351244681 podStartE2EDuration="58.147266422s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:28:27.669020252 +0000 UTC m=+810.086366950" lastFinishedPulling="2025-12-11 05:29:22.465041993 +0000 UTC m=+864.882388691" observedRunningTime="2025-12-11 05:29:23.145828433 +0000 UTC m=+865.563175131" watchObservedRunningTime="2025-12-11 05:29:23.147266422 +0000 UTC m=+865.564613120" Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.188359 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" podStartSLOduration=44.541837412 podStartE2EDuration="58.188342022s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:29:08.739558369 +0000 UTC m=+851.156905067" lastFinishedPulling="2025-12-11 05:29:22.386062979 +0000 UTC m=+864.803409677" observedRunningTime="2025-12-11 05:29:23.183335336 +0000 UTC m=+865.600682034" watchObservedRunningTime="2025-12-11 05:29:23.188342022 +0000 UTC m=+865.605688720" Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.268381 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l85kc" podStartSLOduration=2.939050837 podStartE2EDuration="57.268368096s" podCreationTimestamp="2025-12-11 05:28:26 +0000 UTC" firstStartedPulling="2025-12-11 05:28:28.117104646 +0000 UTC m=+810.534451354" lastFinishedPulling="2025-12-11 05:29:22.446421915 +0000 UTC m=+864.863768613" observedRunningTime="2025-12-11 05:29:23.268344755 +0000 UTC m=+865.685691453" watchObservedRunningTime="2025-12-11 05:29:23.268368096 +0000 UTC m=+865.685714794" Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.271240 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9bcfl" podStartSLOduration=3.443456336 podStartE2EDuration="58.271228734s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:28:27.605712394 +0000 UTC m=+810.023059092" lastFinishedPulling="2025-12-11 05:29:22.433484792 +0000 UTC m=+864.850831490" observedRunningTime="2025-12-11 05:29:23.232077246 +0000 UTC m=+865.649423934" watchObservedRunningTime="2025-12-11 05:29:23.271228734 +0000 UTC m=+865.688575432" Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.362650 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" podStartSLOduration=44.678350226 podStartE2EDuration="58.362634697s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:29:08.749101229 +0000 UTC m=+851.166447927" lastFinishedPulling="2025-12-11 05:29:22.4333857 +0000 UTC m=+864.850732398" observedRunningTime="2025-12-11 05:29:23.358501124 
+0000 UTC m=+865.775847822" watchObservedRunningTime="2025-12-11 05:29:23.362634697 +0000 UTC m=+865.779981395" Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.456122 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tmsjn" podStartSLOduration=34.061923387 podStartE2EDuration="42.456107987s" podCreationTimestamp="2025-12-11 05:28:41 +0000 UTC" firstStartedPulling="2025-12-11 05:29:14.06196693 +0000 UTC m=+856.479313628" lastFinishedPulling="2025-12-11 05:29:22.45615153 +0000 UTC m=+864.873498228" observedRunningTime="2025-12-11 05:29:23.450031891 +0000 UTC m=+865.867378589" watchObservedRunningTime="2025-12-11 05:29:23.456107987 +0000 UTC m=+865.873454685" Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.483423 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-95cq4" podStartSLOduration=29.131641847 podStartE2EDuration="37.483407251s" podCreationTimestamp="2025-12-11 05:28:46 +0000 UTC" firstStartedPulling="2025-12-11 05:29:14.062376281 +0000 UTC m=+856.479722979" lastFinishedPulling="2025-12-11 05:29:22.414141685 +0000 UTC m=+864.831488383" observedRunningTime="2025-12-11 05:29:23.479049222 +0000 UTC m=+865.896395920" watchObservedRunningTime="2025-12-11 05:29:23.483407251 +0000 UTC m=+865.900753949" Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.505409 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-vftnq" podStartSLOduration=4.141617453 podStartE2EDuration="58.505393531s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:28:27.949632329 +0000 UTC m=+810.366979027" lastFinishedPulling="2025-12-11 05:29:22.313408397 +0000 UTC m=+864.730755105" observedRunningTime="2025-12-11 05:29:23.504920968 +0000 UTC m=+865.922267666" watchObservedRunningTime="2025-12-11 05:29:23.505393531 +0000 UTC m=+865.922740229" Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.524515 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5xrs" podStartSLOduration=3.7950179950000003 podStartE2EDuration="58.524497642s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:28:27.670961544 +0000 UTC m=+810.088308242" lastFinishedPulling="2025-12-11 05:29:22.400441181 +0000 UTC m=+864.817787889" observedRunningTime="2025-12-11 05:29:23.519435494 +0000 UTC m=+865.936782202" watchObservedRunningTime="2025-12-11 05:29:23.524497642 +0000 UTC m=+865.941844340" Dec 11 05:29:23 crc kubenswrapper[4628]: I1211 05:29:23.546548 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qxsdk" podStartSLOduration=8.961084895 podStartE2EDuration="58.546529653s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:28:28.114521616 +0000 UTC m=+810.531868314" lastFinishedPulling="2025-12-11 05:29:17.699966374 +0000 UTC m=+860.117313072" observedRunningTime="2025-12-11 05:29:23.541909017 +0000 UTC m=+865.959255715" watchObservedRunningTime="2025-12-11 05:29:23.546529653 +0000 UTC m=+865.963876351" Dec 11 05:29:24 crc kubenswrapper[4628]: I1211 05:29:24.134922 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-x4p8r" 
event={"ID":"c8834adf-70c2-46a6-a5d7-bdb2ddfc91d2","Type":"ContainerStarted","Data":"2f2057ea06f2fadacc8aafae5f13af0753e16a066528a43a9f6806eab2590d7f"} Dec 11 05:29:24 crc kubenswrapper[4628]: I1211 05:29:24.135746 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qxsdk" Dec 11 05:29:24 crc kubenswrapper[4628]: I1211 05:29:24.175946 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-967d97867-x4p8r" podStartSLOduration=2.816218777 podStartE2EDuration="59.175930592s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:28:27.219986423 +0000 UTC m=+809.637333121" lastFinishedPulling="2025-12-11 05:29:23.579698238 +0000 UTC m=+865.997044936" observedRunningTime="2025-12-11 05:29:24.172602921 +0000 UTC m=+866.589949619" watchObservedRunningTime="2025-12-11 05:29:24.175930592 +0000 UTC m=+866.593277290" Dec 11 05:29:25 crc kubenswrapper[4628]: I1211 05:29:25.140119 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-967d97867-x4p8r" Dec 11 05:29:26 crc kubenswrapper[4628]: I1211 05:29:26.099303 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-bqcdn" Dec 11 05:29:26 crc kubenswrapper[4628]: I1211 05:29:26.112738 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-v2lxt" Dec 11 05:29:26 crc kubenswrapper[4628]: I1211 05:29:26.544782 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-nc2xx" Dec 11 05:29:26 crc kubenswrapper[4628]: I1211 05:29:26.680164 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-tnvqg" Dec 11 05:29:27 crc kubenswrapper[4628]: I1211 05:29:27.223727 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-95cq4" Dec 11 05:29:27 crc kubenswrapper[4628]: I1211 05:29:27.223791 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-95cq4" Dec 11 05:29:28 crc kubenswrapper[4628]: I1211 05:29:28.277798 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-95cq4" podUID="490133ce-06a3-43b2-8ad4-1e233d714c56" containerName="registry-server" probeResult="failure" output=< Dec 11 05:29:28 crc kubenswrapper[4628]: timeout: failed to connect service ":50051" within 1s Dec 11 05:29:28 crc kubenswrapper[4628]: > Dec 11 05:29:28 crc kubenswrapper[4628]: I1211 05:29:28.902630 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7546d6d447-f9qwn" Dec 11 05:29:31 crc kubenswrapper[4628]: I1211 05:29:31.186445 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d" event={"ID":"232e8d69-426a-4259-93ab-1ebb4fa89a17","Type":"ContainerStarted","Data":"dbadf33c580addadd57d1ca1f782d18e7b75d24560426d0a9be65c19563bc7bc"} Dec 11 05:29:31 crc kubenswrapper[4628]: I1211 05:29:31.187288 4628 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d" Dec 11 05:29:31 crc kubenswrapper[4628]: I1211 05:29:31.188691 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9" event={"ID":"2b9ef50b-db17-4df4-a936-5a02a25f61d7","Type":"ContainerStarted","Data":"7becc8bca4f325cde3a02d4d2f230cd5cdffb1192093bdd2e2a46ddf44e03c55"} Dec 11 05:29:31 crc kubenswrapper[4628]: I1211 05:29:31.188838 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9" Dec 11 05:29:31 crc kubenswrapper[4628]: I1211 05:29:31.190133 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq" event={"ID":"2b786de1-276f-470c-b60a-e93596dd9e47","Type":"ContainerStarted","Data":"fab5e08d24974eeb08c9e1fd98b981406f7a2c71e43f466bb7c41605f95aa525"} Dec 11 05:29:31 crc kubenswrapper[4628]: I1211 05:29:31.190352 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq" Dec 11 05:29:31 crc kubenswrapper[4628]: I1211 05:29:31.211653 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d" podStartSLOduration=4.061039131 podStartE2EDuration="1m6.211637516s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:28:28.125224785 +0000 UTC m=+810.542571483" lastFinishedPulling="2025-12-11 05:29:30.27582317 +0000 UTC m=+872.693169868" observedRunningTime="2025-12-11 05:29:31.210335151 +0000 UTC m=+873.627681849" watchObservedRunningTime="2025-12-11 05:29:31.211637516 +0000 UTC m=+873.628984214" Dec 11 05:29:31 crc kubenswrapper[4628]: I1211 05:29:31.227272 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9" podStartSLOduration=4.077500464 podStartE2EDuration="1m6.227254463s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:28:28.210643858 +0000 UTC m=+810.627990546" lastFinishedPulling="2025-12-11 05:29:30.360397857 +0000 UTC m=+872.777744545" observedRunningTime="2025-12-11 05:29:31.226298377 +0000 UTC m=+873.643645075" watchObservedRunningTime="2025-12-11 05:29:31.227254463 +0000 UTC m=+873.644601161" Dec 11 05:29:31 crc kubenswrapper[4628]: I1211 05:29:31.454423 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2nftm" Dec 11 05:29:31 crc kubenswrapper[4628]: I1211 05:29:31.478067 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq" podStartSLOduration=4.329227739 podStartE2EDuration="1m6.478049593s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:28:28.210760081 +0000 UTC m=+810.628106779" lastFinishedPulling="2025-12-11 05:29:30.359581935 +0000 UTC m=+872.776928633" observedRunningTime="2025-12-11 05:29:31.252212243 +0000 UTC m=+873.669558991" watchObservedRunningTime="2025-12-11 05:29:31.478049593 +0000 UTC m=+873.895396301" Dec 11 05:29:31 crc kubenswrapper[4628]: I1211 05:29:31.505980 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-2nftm" Dec 11 05:29:31 crc kubenswrapper[4628]: I1211 05:29:31.576627 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-6ff94" Dec 11 05:29:31 crc kubenswrapper[4628]: I1211 05:29:31.964099 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tmsjn" Dec 11 05:29:31 crc kubenswrapper[4628]: I1211 05:29:31.964138 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tmsjn" Dec 11 05:29:32 crc kubenswrapper[4628]: I1211 05:29:32.166596 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879f9w8m4" Dec 11 05:29:32 crc kubenswrapper[4628]: I1211 05:29:32.188516 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2nftm"] Dec 11 05:29:33 crc kubenswrapper[4628]: I1211 05:29:33.002606 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tmsjn" podUID="68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce" containerName="registry-server" probeResult="failure" output=< Dec 11 05:29:33 crc kubenswrapper[4628]: timeout: failed to connect service ":50051" within 1s Dec 11 05:29:33 crc kubenswrapper[4628]: > Dec 11 05:29:33 crc kubenswrapper[4628]: I1211 05:29:33.202982 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4" event={"ID":"53a3113c-a3d2-42c8-8ab8-b26b448a728a","Type":"ContainerStarted","Data":"d1cb48b06a3eeeaf9780eebf95eeec11570689822340af522d44f6f7f142a939"} Dec 11 05:29:33 crc kubenswrapper[4628]: I1211 05:29:33.203133 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2nftm" podUID="d0150af6-b59b-4a9f-89da-1d6cdcd79a44" containerName="registry-server" containerID="cri-o://ce8c70e0e1f1d3c8dbfa81d9ef754a59f86370c8ee4b8d858ca39a094dc80b8f" gracePeriod=2 Dec 11 05:29:33 crc kubenswrapper[4628]: I1211 05:29:33.203340 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4" Dec 11 05:29:33 crc kubenswrapper[4628]: I1211 05:29:33.224303 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4" podStartSLOduration=4.114621598 podStartE2EDuration="1m8.224283476s" podCreationTimestamp="2025-12-11 05:28:25 +0000 UTC" firstStartedPulling="2025-12-11 05:28:28.206315372 +0000 UTC m=+810.623662070" lastFinishedPulling="2025-12-11 05:29:32.31597725 +0000 UTC m=+874.733323948" observedRunningTime="2025-12-11 05:29:33.219462364 +0000 UTC m=+875.636809062" watchObservedRunningTime="2025-12-11 05:29:33.224283476 +0000 UTC m=+875.641630174" Dec 11 05:29:33 crc kubenswrapper[4628]: I1211 05:29:33.566075 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2nftm" Dec 11 05:29:33 crc kubenswrapper[4628]: I1211 05:29:33.704702 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbkdh\" (UniqueName: \"kubernetes.io/projected/d0150af6-b59b-4a9f-89da-1d6cdcd79a44-kube-api-access-lbkdh\") pod \"d0150af6-b59b-4a9f-89da-1d6cdcd79a44\" (UID: \"d0150af6-b59b-4a9f-89da-1d6cdcd79a44\") " Dec 11 05:29:33 crc kubenswrapper[4628]: I1211 05:29:33.704772 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0150af6-b59b-4a9f-89da-1d6cdcd79a44-utilities\") pod \"d0150af6-b59b-4a9f-89da-1d6cdcd79a44\" (UID: \"d0150af6-b59b-4a9f-89da-1d6cdcd79a44\") " Dec 11 05:29:33 crc kubenswrapper[4628]: I1211 05:29:33.704947 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0150af6-b59b-4a9f-89da-1d6cdcd79a44-catalog-content\") pod \"d0150af6-b59b-4a9f-89da-1d6cdcd79a44\" (UID: \"d0150af6-b59b-4a9f-89da-1d6cdcd79a44\") " Dec 11 05:29:33 crc kubenswrapper[4628]: I1211 05:29:33.705676 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0150af6-b59b-4a9f-89da-1d6cdcd79a44-utilities" (OuterVolumeSpecName: "utilities") pod "d0150af6-b59b-4a9f-89da-1d6cdcd79a44" (UID: "d0150af6-b59b-4a9f-89da-1d6cdcd79a44"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:29:33 crc kubenswrapper[4628]: I1211 05:29:33.719683 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0150af6-b59b-4a9f-89da-1d6cdcd79a44-kube-api-access-lbkdh" (OuterVolumeSpecName: "kube-api-access-lbkdh") pod "d0150af6-b59b-4a9f-89da-1d6cdcd79a44" (UID: "d0150af6-b59b-4a9f-89da-1d6cdcd79a44"). InnerVolumeSpecName "kube-api-access-lbkdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:29:33 crc kubenswrapper[4628]: I1211 05:29:33.734735 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0150af6-b59b-4a9f-89da-1d6cdcd79a44-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0150af6-b59b-4a9f-89da-1d6cdcd79a44" (UID: "d0150af6-b59b-4a9f-89da-1d6cdcd79a44"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:29:33 crc kubenswrapper[4628]: I1211 05:29:33.807178 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbkdh\" (UniqueName: \"kubernetes.io/projected/d0150af6-b59b-4a9f-89da-1d6cdcd79a44-kube-api-access-lbkdh\") on node \"crc\" DevicePath \"\"" Dec 11 05:29:33 crc kubenswrapper[4628]: I1211 05:29:33.807224 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0150af6-b59b-4a9f-89da-1d6cdcd79a44-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:29:33 crc kubenswrapper[4628]: I1211 05:29:33.807237 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0150af6-b59b-4a9f-89da-1d6cdcd79a44-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:29:34 crc kubenswrapper[4628]: I1211 05:29:34.212671 4628 generic.go:334] "Generic (PLEG): container finished" podID="d0150af6-b59b-4a9f-89da-1d6cdcd79a44" containerID="ce8c70e0e1f1d3c8dbfa81d9ef754a59f86370c8ee4b8d858ca39a094dc80b8f" exitCode=0 Dec 11 05:29:34 crc kubenswrapper[4628]: I1211 05:29:34.212733 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2nftm" event={"ID":"d0150af6-b59b-4a9f-89da-1d6cdcd79a44","Type":"ContainerDied","Data":"ce8c70e0e1f1d3c8dbfa81d9ef754a59f86370c8ee4b8d858ca39a094dc80b8f"} Dec 11 05:29:34 crc kubenswrapper[4628]: I1211 05:29:34.212780 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2nftm" event={"ID":"d0150af6-b59b-4a9f-89da-1d6cdcd79a44","Type":"ContainerDied","Data":"ef219b79617848f03753067cb2cb64fba14f10be55cb183bec225c1a7e254b8b"} Dec 11 05:29:34 crc kubenswrapper[4628]: I1211 05:29:34.212802 4628 scope.go:117] "RemoveContainer" containerID="ce8c70e0e1f1d3c8dbfa81d9ef754a59f86370c8ee4b8d858ca39a094dc80b8f" Dec 11 05:29:34 crc kubenswrapper[4628]: I1211 05:29:34.212828 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2nftm" Dec 11 05:29:34 crc kubenswrapper[4628]: I1211 05:29:34.242588 4628 scope.go:117] "RemoveContainer" containerID="c380ef520a31624689a27a1ecd0902d602da6b78c6e241a29f681dda2bfe8167" Dec 11 05:29:34 crc kubenswrapper[4628]: I1211 05:29:34.246599 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2nftm"] Dec 11 05:29:34 crc kubenswrapper[4628]: I1211 05:29:34.254343 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2nftm"] Dec 11 05:29:34 crc kubenswrapper[4628]: I1211 05:29:34.259120 4628 scope.go:117] "RemoveContainer" containerID="936f3514df17a97987e66081133003ab791ebb88b94593b8a6b252f0e33b1533" Dec 11 05:29:34 crc kubenswrapper[4628]: I1211 05:29:34.294081 4628 scope.go:117] "RemoveContainer" containerID="ce8c70e0e1f1d3c8dbfa81d9ef754a59f86370c8ee4b8d858ca39a094dc80b8f" Dec 11 05:29:34 crc kubenswrapper[4628]: E1211 05:29:34.294489 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce8c70e0e1f1d3c8dbfa81d9ef754a59f86370c8ee4b8d858ca39a094dc80b8f\": container with ID starting with ce8c70e0e1f1d3c8dbfa81d9ef754a59f86370c8ee4b8d858ca39a094dc80b8f not found: ID does not exist" containerID="ce8c70e0e1f1d3c8dbfa81d9ef754a59f86370c8ee4b8d858ca39a094dc80b8f" Dec 11 05:29:34 crc kubenswrapper[4628]: I1211 05:29:34.294519 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce8c70e0e1f1d3c8dbfa81d9ef754a59f86370c8ee4b8d858ca39a094dc80b8f"} err="failed to get container status \"ce8c70e0e1f1d3c8dbfa81d9ef754a59f86370c8ee4b8d858ca39a094dc80b8f\": rpc error: code = NotFound desc = could not find container \"ce8c70e0e1f1d3c8dbfa81d9ef754a59f86370c8ee4b8d858ca39a094dc80b8f\": container with ID starting with ce8c70e0e1f1d3c8dbfa81d9ef754a59f86370c8ee4b8d858ca39a094dc80b8f not found: ID does not exist" Dec 11 05:29:34 crc kubenswrapper[4628]: I1211 05:29:34.294540 4628 scope.go:117] "RemoveContainer" containerID="c380ef520a31624689a27a1ecd0902d602da6b78c6e241a29f681dda2bfe8167" Dec 11 05:29:34 crc kubenswrapper[4628]: E1211 05:29:34.294818 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c380ef520a31624689a27a1ecd0902d602da6b78c6e241a29f681dda2bfe8167\": container with ID starting with c380ef520a31624689a27a1ecd0902d602da6b78c6e241a29f681dda2bfe8167 not found: ID does not exist" containerID="c380ef520a31624689a27a1ecd0902d602da6b78c6e241a29f681dda2bfe8167" Dec 11 05:29:34 crc kubenswrapper[4628]: I1211 05:29:34.294959 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c380ef520a31624689a27a1ecd0902d602da6b78c6e241a29f681dda2bfe8167"} err="failed to get container status \"c380ef520a31624689a27a1ecd0902d602da6b78c6e241a29f681dda2bfe8167\": rpc error: code = NotFound desc = could not find container \"c380ef520a31624689a27a1ecd0902d602da6b78c6e241a29f681dda2bfe8167\": container with ID starting with c380ef520a31624689a27a1ecd0902d602da6b78c6e241a29f681dda2bfe8167 not found: ID does not exist" Dec 11 05:29:34 crc kubenswrapper[4628]: I1211 05:29:34.294996 4628 scope.go:117] "RemoveContainer" containerID="936f3514df17a97987e66081133003ab791ebb88b94593b8a6b252f0e33b1533" Dec 11 05:29:34 crc kubenswrapper[4628]: E1211 05:29:34.295209 4628 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"936f3514df17a97987e66081133003ab791ebb88b94593b8a6b252f0e33b1533\": container with ID starting with 936f3514df17a97987e66081133003ab791ebb88b94593b8a6b252f0e33b1533 not found: ID does not exist" containerID="936f3514df17a97987e66081133003ab791ebb88b94593b8a6b252f0e33b1533" Dec 11 05:29:34 crc kubenswrapper[4628]: I1211 05:29:34.295232 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"936f3514df17a97987e66081133003ab791ebb88b94593b8a6b252f0e33b1533"} err="failed to get container status \"936f3514df17a97987e66081133003ab791ebb88b94593b8a6b252f0e33b1533\": rpc error: code = NotFound desc = could not find container \"936f3514df17a97987e66081133003ab791ebb88b94593b8a6b252f0e33b1533\": container with ID starting with 936f3514df17a97987e66081133003ab791ebb88b94593b8a6b252f0e33b1533 not found: ID does not exist" Dec 11 05:29:35 crc kubenswrapper[4628]: I1211 05:29:35.899454 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0150af6-b59b-4a9f-89da-1d6cdcd79a44" path="/var/lib/kubelet/pods/d0150af6-b59b-4a9f-89da-1d6cdcd79a44/volumes" Dec 11 05:29:36 crc kubenswrapper[4628]: I1211 05:29:36.005126 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-967d97867-x4p8r" Dec 11 05:29:36 crc kubenswrapper[4628]: I1211 05:29:36.132356 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-9bcfl" Dec 11 05:29:36 crc kubenswrapper[4628]: I1211 05:29:36.133114 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-z6dn7" Dec 11 05:29:36 crc kubenswrapper[4628]: I1211 05:29:36.156697 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vcz8d" Dec 11 05:29:36 crc kubenswrapper[4628]: I1211 05:29:36.218102 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w5xrs" Dec 11 05:29:36 crc kubenswrapper[4628]: I1211 05:29:36.251787 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-vftnq" Dec 11 05:29:36 crc kubenswrapper[4628]: I1211 05:29:36.573596 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-qxsdk" Dec 11 05:29:36 crc kubenswrapper[4628]: I1211 05:29:36.581827 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-zvjrq" Dec 11 05:29:36 crc kubenswrapper[4628]: I1211 05:29:36.631211 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-2q4b9" Dec 11 05:29:37 crc kubenswrapper[4628]: I1211 05:29:37.263729 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-95cq4" Dec 11 05:29:37 crc kubenswrapper[4628]: I1211 05:29:37.305828 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-95cq4" Dec 11 05:29:37 crc kubenswrapper[4628]: 
I1211 05:29:37.596688 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-95cq4"] Dec 11 05:29:39 crc kubenswrapper[4628]: I1211 05:29:39.251550 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-95cq4" podUID="490133ce-06a3-43b2-8ad4-1e233d714c56" containerName="registry-server" containerID="cri-o://a5ffb80b34954fc9c0df62a5c0b924d08378983134aacb315955cf473bff888a" gracePeriod=2 Dec 11 05:29:39 crc kubenswrapper[4628]: I1211 05:29:39.726303 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-95cq4" Dec 11 05:29:39 crc kubenswrapper[4628]: I1211 05:29:39.750964 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8ntv\" (UniqueName: \"kubernetes.io/projected/490133ce-06a3-43b2-8ad4-1e233d714c56-kube-api-access-s8ntv\") pod \"490133ce-06a3-43b2-8ad4-1e233d714c56\" (UID: \"490133ce-06a3-43b2-8ad4-1e233d714c56\") " Dec 11 05:29:39 crc kubenswrapper[4628]: I1211 05:29:39.751098 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/490133ce-06a3-43b2-8ad4-1e233d714c56-catalog-content\") pod \"490133ce-06a3-43b2-8ad4-1e233d714c56\" (UID: \"490133ce-06a3-43b2-8ad4-1e233d714c56\") " Dec 11 05:29:39 crc kubenswrapper[4628]: I1211 05:29:39.751165 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/490133ce-06a3-43b2-8ad4-1e233d714c56-utilities\") pod \"490133ce-06a3-43b2-8ad4-1e233d714c56\" (UID: \"490133ce-06a3-43b2-8ad4-1e233d714c56\") " Dec 11 05:29:39 crc kubenswrapper[4628]: I1211 05:29:39.752314 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/490133ce-06a3-43b2-8ad4-1e233d714c56-utilities" (OuterVolumeSpecName: "utilities") pod "490133ce-06a3-43b2-8ad4-1e233d714c56" (UID: "490133ce-06a3-43b2-8ad4-1e233d714c56"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:29:39 crc kubenswrapper[4628]: I1211 05:29:39.755493 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/490133ce-06a3-43b2-8ad4-1e233d714c56-kube-api-access-s8ntv" (OuterVolumeSpecName: "kube-api-access-s8ntv") pod "490133ce-06a3-43b2-8ad4-1e233d714c56" (UID: "490133ce-06a3-43b2-8ad4-1e233d714c56"). InnerVolumeSpecName "kube-api-access-s8ntv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:29:39 crc kubenswrapper[4628]: I1211 05:29:39.815556 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/490133ce-06a3-43b2-8ad4-1e233d714c56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "490133ce-06a3-43b2-8ad4-1e233d714c56" (UID: "490133ce-06a3-43b2-8ad4-1e233d714c56"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:29:39 crc kubenswrapper[4628]: I1211 05:29:39.854075 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/490133ce-06a3-43b2-8ad4-1e233d714c56-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:29:39 crc kubenswrapper[4628]: I1211 05:29:39.854176 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/490133ce-06a3-43b2-8ad4-1e233d714c56-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:29:39 crc kubenswrapper[4628]: I1211 05:29:39.854188 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8ntv\" (UniqueName: \"kubernetes.io/projected/490133ce-06a3-43b2-8ad4-1e233d714c56-kube-api-access-s8ntv\") on node \"crc\" DevicePath \"\"" Dec 11 05:29:40 crc kubenswrapper[4628]: I1211 05:29:40.274905 4628 generic.go:334] "Generic (PLEG): container finished" podID="490133ce-06a3-43b2-8ad4-1e233d714c56" containerID="a5ffb80b34954fc9c0df62a5c0b924d08378983134aacb315955cf473bff888a" exitCode=0 Dec 11 05:29:40 crc kubenswrapper[4628]: I1211 05:29:40.275064 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-95cq4" Dec 11 05:29:40 crc kubenswrapper[4628]: I1211 05:29:40.276050 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95cq4" event={"ID":"490133ce-06a3-43b2-8ad4-1e233d714c56","Type":"ContainerDied","Data":"a5ffb80b34954fc9c0df62a5c0b924d08378983134aacb315955cf473bff888a"} Dec 11 05:29:40 crc kubenswrapper[4628]: I1211 05:29:40.276133 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95cq4" event={"ID":"490133ce-06a3-43b2-8ad4-1e233d714c56","Type":"ContainerDied","Data":"37457f348c20be4962932a279e06c61cee5eb5114343633951792640314d3f31"} Dec 11 05:29:40 crc kubenswrapper[4628]: I1211 05:29:40.276167 4628 scope.go:117] "RemoveContainer" containerID="a5ffb80b34954fc9c0df62a5c0b924d08378983134aacb315955cf473bff888a" Dec 11 05:29:40 crc kubenswrapper[4628]: I1211 05:29:40.301327 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-95cq4"] Dec 11 05:29:40 crc kubenswrapper[4628]: I1211 05:29:40.319148 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-95cq4"] Dec 11 05:29:40 crc kubenswrapper[4628]: I1211 05:29:40.326402 4628 scope.go:117] "RemoveContainer" containerID="f6aca1759bd81c0af66c99e10c23a322c19efa7d441ca86b9996ad57f6a56ed4" Dec 11 05:29:40 crc kubenswrapper[4628]: I1211 05:29:40.354974 4628 scope.go:117] "RemoveContainer" containerID="dbbb24fe82848cebb2f5069273ce82a0e6008b0a8bc6f4798fa38ab15be7ad95" Dec 11 05:29:40 crc kubenswrapper[4628]: I1211 05:29:40.376388 4628 scope.go:117] "RemoveContainer" containerID="a5ffb80b34954fc9c0df62a5c0b924d08378983134aacb315955cf473bff888a" Dec 11 05:29:40 crc kubenswrapper[4628]: E1211 05:29:40.376992 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5ffb80b34954fc9c0df62a5c0b924d08378983134aacb315955cf473bff888a\": container with ID starting with a5ffb80b34954fc9c0df62a5c0b924d08378983134aacb315955cf473bff888a not found: ID does not exist" containerID="a5ffb80b34954fc9c0df62a5c0b924d08378983134aacb315955cf473bff888a" Dec 11 05:29:40 crc kubenswrapper[4628]: I1211 05:29:40.377045 
4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5ffb80b34954fc9c0df62a5c0b924d08378983134aacb315955cf473bff888a"} err="failed to get container status \"a5ffb80b34954fc9c0df62a5c0b924d08378983134aacb315955cf473bff888a\": rpc error: code = NotFound desc = could not find container \"a5ffb80b34954fc9c0df62a5c0b924d08378983134aacb315955cf473bff888a\": container with ID starting with a5ffb80b34954fc9c0df62a5c0b924d08378983134aacb315955cf473bff888a not found: ID does not exist" Dec 11 05:29:40 crc kubenswrapper[4628]: I1211 05:29:40.377081 4628 scope.go:117] "RemoveContainer" containerID="f6aca1759bd81c0af66c99e10c23a322c19efa7d441ca86b9996ad57f6a56ed4" Dec 11 05:29:40 crc kubenswrapper[4628]: E1211 05:29:40.377615 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6aca1759bd81c0af66c99e10c23a322c19efa7d441ca86b9996ad57f6a56ed4\": container with ID starting with f6aca1759bd81c0af66c99e10c23a322c19efa7d441ca86b9996ad57f6a56ed4 not found: ID does not exist" containerID="f6aca1759bd81c0af66c99e10c23a322c19efa7d441ca86b9996ad57f6a56ed4" Dec 11 05:29:40 crc kubenswrapper[4628]: I1211 05:29:40.377682 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6aca1759bd81c0af66c99e10c23a322c19efa7d441ca86b9996ad57f6a56ed4"} err="failed to get container status \"f6aca1759bd81c0af66c99e10c23a322c19efa7d441ca86b9996ad57f6a56ed4\": rpc error: code = NotFound desc = could not find container \"f6aca1759bd81c0af66c99e10c23a322c19efa7d441ca86b9996ad57f6a56ed4\": container with ID starting with f6aca1759bd81c0af66c99e10c23a322c19efa7d441ca86b9996ad57f6a56ed4 not found: ID does not exist" Dec 11 05:29:40 crc kubenswrapper[4628]: I1211 05:29:40.377715 4628 scope.go:117] "RemoveContainer" containerID="dbbb24fe82848cebb2f5069273ce82a0e6008b0a8bc6f4798fa38ab15be7ad95" Dec 11 05:29:40 crc kubenswrapper[4628]: E1211 05:29:40.378092 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbbb24fe82848cebb2f5069273ce82a0e6008b0a8bc6f4798fa38ab15be7ad95\": container with ID starting with dbbb24fe82848cebb2f5069273ce82a0e6008b0a8bc6f4798fa38ab15be7ad95 not found: ID does not exist" containerID="dbbb24fe82848cebb2f5069273ce82a0e6008b0a8bc6f4798fa38ab15be7ad95" Dec 11 05:29:40 crc kubenswrapper[4628]: I1211 05:29:40.378192 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbbb24fe82848cebb2f5069273ce82a0e6008b0a8bc6f4798fa38ab15be7ad95"} err="failed to get container status \"dbbb24fe82848cebb2f5069273ce82a0e6008b0a8bc6f4798fa38ab15be7ad95\": rpc error: code = NotFound desc = could not find container \"dbbb24fe82848cebb2f5069273ce82a0e6008b0a8bc6f4798fa38ab15be7ad95\": container with ID starting with dbbb24fe82848cebb2f5069273ce82a0e6008b0a8bc6f4798fa38ab15be7ad95 not found: ID does not exist" Dec 11 05:29:41 crc kubenswrapper[4628]: I1211 05:29:41.906607 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="490133ce-06a3-43b2-8ad4-1e233d714c56" path="/var/lib/kubelet/pods/490133ce-06a3-43b2-8ad4-1e233d714c56/volumes" Dec 11 05:29:42 crc kubenswrapper[4628]: I1211 05:29:42.027201 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tmsjn" Dec 11 05:29:42 crc kubenswrapper[4628]: I1211 05:29:42.074022 4628 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tmsjn" Dec 11 05:29:42 crc kubenswrapper[4628]: I1211 05:29:42.991167 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tmsjn"] Dec 11 05:29:43 crc kubenswrapper[4628]: I1211 05:29:43.306240 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tmsjn" podUID="68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce" containerName="registry-server" containerID="cri-o://cc75887c54704cec47a5cd007670d546e02506eddfce1e3556b2e15e94dca59c" gracePeriod=2 Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.317370 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tmsjn" Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.317753 4628 generic.go:334] "Generic (PLEG): container finished" podID="68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce" containerID="cc75887c54704cec47a5cd007670d546e02506eddfce1e3556b2e15e94dca59c" exitCode=0 Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.317800 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmsjn" event={"ID":"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce","Type":"ContainerDied","Data":"cc75887c54704cec47a5cd007670d546e02506eddfce1e3556b2e15e94dca59c"} Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.317895 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tmsjn" event={"ID":"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce","Type":"ContainerDied","Data":"e2a960a38a48354c53eb8e6bc01470dcf889b5d6b04501c2a30a47794070b294"} Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.317930 4628 scope.go:117] "RemoveContainer" containerID="cc75887c54704cec47a5cd007670d546e02506eddfce1e3556b2e15e94dca59c" Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.341595 4628 scope.go:117] "RemoveContainer" containerID="f18ba36bbaed416e5f4c55f3e23b99bebe604dafd70ea7cb9fdeeceb4c98c86f" Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.388724 4628 scope.go:117] "RemoveContainer" containerID="9c2cdcf43a8ec7bdfb079ae80f8215df8427eb126a8633beea12f67b2c0af338" Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.404028 4628 scope.go:117] "RemoveContainer" containerID="cc75887c54704cec47a5cd007670d546e02506eddfce1e3556b2e15e94dca59c" Dec 11 05:29:44 crc kubenswrapper[4628]: E1211 05:29:44.404422 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc75887c54704cec47a5cd007670d546e02506eddfce1e3556b2e15e94dca59c\": container with ID starting with cc75887c54704cec47a5cd007670d546e02506eddfce1e3556b2e15e94dca59c not found: ID does not exist" containerID="cc75887c54704cec47a5cd007670d546e02506eddfce1e3556b2e15e94dca59c" Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.404456 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc75887c54704cec47a5cd007670d546e02506eddfce1e3556b2e15e94dca59c"} err="failed to get container status \"cc75887c54704cec47a5cd007670d546e02506eddfce1e3556b2e15e94dca59c\": rpc error: code = NotFound desc = could not find container \"cc75887c54704cec47a5cd007670d546e02506eddfce1e3556b2e15e94dca59c\": container with ID starting with cc75887c54704cec47a5cd007670d546e02506eddfce1e3556b2e15e94dca59c not found: ID does not exist" Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.404480 4628 
scope.go:117] "RemoveContainer" containerID="f18ba36bbaed416e5f4c55f3e23b99bebe604dafd70ea7cb9fdeeceb4c98c86f" Dec 11 05:29:44 crc kubenswrapper[4628]: E1211 05:29:44.404692 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f18ba36bbaed416e5f4c55f3e23b99bebe604dafd70ea7cb9fdeeceb4c98c86f\": container with ID starting with f18ba36bbaed416e5f4c55f3e23b99bebe604dafd70ea7cb9fdeeceb4c98c86f not found: ID does not exist" containerID="f18ba36bbaed416e5f4c55f3e23b99bebe604dafd70ea7cb9fdeeceb4c98c86f" Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.404723 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f18ba36bbaed416e5f4c55f3e23b99bebe604dafd70ea7cb9fdeeceb4c98c86f"} err="failed to get container status \"f18ba36bbaed416e5f4c55f3e23b99bebe604dafd70ea7cb9fdeeceb4c98c86f\": rpc error: code = NotFound desc = could not find container \"f18ba36bbaed416e5f4c55f3e23b99bebe604dafd70ea7cb9fdeeceb4c98c86f\": container with ID starting with f18ba36bbaed416e5f4c55f3e23b99bebe604dafd70ea7cb9fdeeceb4c98c86f not found: ID does not exist" Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.404743 4628 scope.go:117] "RemoveContainer" containerID="9c2cdcf43a8ec7bdfb079ae80f8215df8427eb126a8633beea12f67b2c0af338" Dec 11 05:29:44 crc kubenswrapper[4628]: E1211 05:29:44.405013 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c2cdcf43a8ec7bdfb079ae80f8215df8427eb126a8633beea12f67b2c0af338\": container with ID starting with 9c2cdcf43a8ec7bdfb079ae80f8215df8427eb126a8633beea12f67b2c0af338 not found: ID does not exist" containerID="9c2cdcf43a8ec7bdfb079ae80f8215df8427eb126a8633beea12f67b2c0af338" Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.405068 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c2cdcf43a8ec7bdfb079ae80f8215df8427eb126a8633beea12f67b2c0af338"} err="failed to get container status \"9c2cdcf43a8ec7bdfb079ae80f8215df8427eb126a8633beea12f67b2c0af338\": rpc error: code = NotFound desc = could not find container \"9c2cdcf43a8ec7bdfb079ae80f8215df8427eb126a8633beea12f67b2c0af338\": container with ID starting with 9c2cdcf43a8ec7bdfb079ae80f8215df8427eb126a8633beea12f67b2c0af338 not found: ID does not exist" Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.469802 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh88h\" (UniqueName: \"kubernetes.io/projected/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce-kube-api-access-vh88h\") pod \"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce\" (UID: \"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce\") " Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.470598 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce-catalog-content\") pod \"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce\" (UID: \"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce\") " Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.470785 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce-utilities\") pod \"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce\" (UID: \"68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce\") " Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.471989 4628 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce-utilities" (OuterVolumeSpecName: "utilities") pod "68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce" (UID: "68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.477199 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce-kube-api-access-vh88h" (OuterVolumeSpecName: "kube-api-access-vh88h") pod "68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce" (UID: "68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce"). InnerVolumeSpecName "kube-api-access-vh88h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.572769 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh88h\" (UniqueName: \"kubernetes.io/projected/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce-kube-api-access-vh88h\") on node \"crc\" DevicePath \"\"" Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.572801 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.594704 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce" (UID: "68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:29:44 crc kubenswrapper[4628]: I1211 05:29:44.674454 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:29:45 crc kubenswrapper[4628]: I1211 05:29:45.326743 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tmsjn" Dec 11 05:29:45 crc kubenswrapper[4628]: I1211 05:29:45.368946 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tmsjn"] Dec 11 05:29:45 crc kubenswrapper[4628]: I1211 05:29:45.375430 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tmsjn"] Dec 11 05:29:45 crc kubenswrapper[4628]: I1211 05:29:45.909771 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce" path="/var/lib/kubelet/pods/68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce/volumes" Dec 11 05:29:46 crc kubenswrapper[4628]: I1211 05:29:46.668991 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-75944c9b7-l2wf4" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.146878 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9"] Dec 11 05:30:00 crc kubenswrapper[4628]: E1211 05:30:00.147595 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf3bfbad-c97e-47e9-9390-233f50d34f49" containerName="registry-server" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.147608 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf3bfbad-c97e-47e9-9390-233f50d34f49" containerName="registry-server" Dec 11 05:30:00 crc kubenswrapper[4628]: E1211 05:30:00.147629 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0150af6-b59b-4a9f-89da-1d6cdcd79a44" containerName="extract-utilities" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.147637 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0150af6-b59b-4a9f-89da-1d6cdcd79a44" containerName="extract-utilities" Dec 11 05:30:00 crc kubenswrapper[4628]: E1211 05:30:00.147645 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce" containerName="extract-content" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.147651 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce" containerName="extract-content" Dec 11 05:30:00 crc kubenswrapper[4628]: E1211 05:30:00.147658 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0150af6-b59b-4a9f-89da-1d6cdcd79a44" containerName="extract-content" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.147663 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0150af6-b59b-4a9f-89da-1d6cdcd79a44" containerName="extract-content" Dec 11 05:30:00 crc kubenswrapper[4628]: E1211 05:30:00.147676 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf3bfbad-c97e-47e9-9390-233f50d34f49" containerName="extract-utilities" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.147682 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf3bfbad-c97e-47e9-9390-233f50d34f49" containerName="extract-utilities" Dec 11 05:30:00 crc kubenswrapper[4628]: E1211 05:30:00.147693 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0150af6-b59b-4a9f-89da-1d6cdcd79a44" containerName="registry-server" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.147698 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0150af6-b59b-4a9f-89da-1d6cdcd79a44" containerName="registry-server" Dec 11 05:30:00 crc kubenswrapper[4628]: E1211 05:30:00.147718 4628 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="490133ce-06a3-43b2-8ad4-1e233d714c56" containerName="extract-utilities" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.147724 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="490133ce-06a3-43b2-8ad4-1e233d714c56" containerName="extract-utilities" Dec 11 05:30:00 crc kubenswrapper[4628]: E1211 05:30:00.147736 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490133ce-06a3-43b2-8ad4-1e233d714c56" containerName="extract-content" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.147742 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="490133ce-06a3-43b2-8ad4-1e233d714c56" containerName="extract-content" Dec 11 05:30:00 crc kubenswrapper[4628]: E1211 05:30:00.147750 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490133ce-06a3-43b2-8ad4-1e233d714c56" containerName="registry-server" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.147755 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="490133ce-06a3-43b2-8ad4-1e233d714c56" containerName="registry-server" Dec 11 05:30:00 crc kubenswrapper[4628]: E1211 05:30:00.147766 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf3bfbad-c97e-47e9-9390-233f50d34f49" containerName="extract-content" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.147771 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf3bfbad-c97e-47e9-9390-233f50d34f49" containerName="extract-content" Dec 11 05:30:00 crc kubenswrapper[4628]: E1211 05:30:00.147780 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce" containerName="extract-utilities" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.147786 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce" containerName="extract-utilities" Dec 11 05:30:00 crc kubenswrapper[4628]: E1211 05:30:00.147793 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce" containerName="registry-server" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.147799 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce" containerName="registry-server" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.147936 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf3bfbad-c97e-47e9-9390-233f50d34f49" containerName="registry-server" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.147951 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="68cf3e3a-61d8-4d9f-b78b-7e026b3ae0ce" containerName="registry-server" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.147963 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0150af6-b59b-4a9f-89da-1d6cdcd79a44" containerName="registry-server" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.147973 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="490133ce-06a3-43b2-8ad4-1e233d714c56" containerName="registry-server" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.148387 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.151570 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.151808 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.157269 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9"] Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.320082 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2008d0e-e957-41d4-946a-c057dfe90bfb-config-volume\") pod \"collect-profiles-29423850-snfj9\" (UID: \"b2008d0e-e957-41d4-946a-c057dfe90bfb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.320361 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2008d0e-e957-41d4-946a-c057dfe90bfb-secret-volume\") pod \"collect-profiles-29423850-snfj9\" (UID: \"b2008d0e-e957-41d4-946a-c057dfe90bfb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.320417 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gllxz\" (UniqueName: \"kubernetes.io/projected/b2008d0e-e957-41d4-946a-c057dfe90bfb-kube-api-access-gllxz\") pod \"collect-profiles-29423850-snfj9\" (UID: \"b2008d0e-e957-41d4-946a-c057dfe90bfb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.421396 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2008d0e-e957-41d4-946a-c057dfe90bfb-config-volume\") pod \"collect-profiles-29423850-snfj9\" (UID: \"b2008d0e-e957-41d4-946a-c057dfe90bfb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.421470 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2008d0e-e957-41d4-946a-c057dfe90bfb-secret-volume\") pod \"collect-profiles-29423850-snfj9\" (UID: \"b2008d0e-e957-41d4-946a-c057dfe90bfb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.421523 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gllxz\" (UniqueName: \"kubernetes.io/projected/b2008d0e-e957-41d4-946a-c057dfe90bfb-kube-api-access-gllxz\") pod \"collect-profiles-29423850-snfj9\" (UID: \"b2008d0e-e957-41d4-946a-c057dfe90bfb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.422484 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2008d0e-e957-41d4-946a-c057dfe90bfb-config-volume\") pod 
\"collect-profiles-29423850-snfj9\" (UID: \"b2008d0e-e957-41d4-946a-c057dfe90bfb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.427836 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2008d0e-e957-41d4-946a-c057dfe90bfb-secret-volume\") pod \"collect-profiles-29423850-snfj9\" (UID: \"b2008d0e-e957-41d4-946a-c057dfe90bfb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.438750 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gllxz\" (UniqueName: \"kubernetes.io/projected/b2008d0e-e957-41d4-946a-c057dfe90bfb-kube-api-access-gllxz\") pod \"collect-profiles-29423850-snfj9\" (UID: \"b2008d0e-e957-41d4-946a-c057dfe90bfb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.470542 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9" Dec 11 05:30:00 crc kubenswrapper[4628]: I1211 05:30:00.902778 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9"] Dec 11 05:30:00 crc kubenswrapper[4628]: W1211 05:30:00.913139 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2008d0e_e957_41d4_946a_c057dfe90bfb.slice/crio-d940aec76f8d13df8c9846eaedc9f193df2e609aea11ffd60e1df3d5c03995fe WatchSource:0}: Error finding container d940aec76f8d13df8c9846eaedc9f193df2e609aea11ffd60e1df3d5c03995fe: Status 404 returned error can't find the container with id d940aec76f8d13df8c9846eaedc9f193df2e609aea11ffd60e1df3d5c03995fe Dec 11 05:30:01 crc kubenswrapper[4628]: E1211 05:30:01.393209 4628 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2008d0e_e957_41d4_946a_c057dfe90bfb.slice/crio-conmon-9b254b0f254fc13cfc8e09ef8d6532083d6a1d6f2b7e816fabf2ca493e14f7d2.scope\": RecentStats: unable to find data in memory cache]" Dec 11 05:30:01 crc kubenswrapper[4628]: I1211 05:30:01.427052 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:30:01 crc kubenswrapper[4628]: I1211 05:30:01.427439 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:30:01 crc kubenswrapper[4628]: I1211 05:30:01.476923 4628 generic.go:334] "Generic (PLEG): container finished" podID="b2008d0e-e957-41d4-946a-c057dfe90bfb" containerID="9b254b0f254fc13cfc8e09ef8d6532083d6a1d6f2b7e816fabf2ca493e14f7d2" exitCode=0 Dec 11 05:30:01 crc kubenswrapper[4628]: I1211 05:30:01.476971 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9" event={"ID":"b2008d0e-e957-41d4-946a-c057dfe90bfb","Type":"ContainerDied","Data":"9b254b0f254fc13cfc8e09ef8d6532083d6a1d6f2b7e816fabf2ca493e14f7d2"} Dec 11 05:30:01 crc kubenswrapper[4628]: I1211 05:30:01.476999 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9" event={"ID":"b2008d0e-e957-41d4-946a-c057dfe90bfb","Type":"ContainerStarted","Data":"d940aec76f8d13df8c9846eaedc9f193df2e609aea11ffd60e1df3d5c03995fe"} Dec 11 05:30:02 crc kubenswrapper[4628]: I1211 05:30:02.816426 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9" Dec 11 05:30:02 crc kubenswrapper[4628]: I1211 05:30:02.888494 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gllxz\" (UniqueName: \"kubernetes.io/projected/b2008d0e-e957-41d4-946a-c057dfe90bfb-kube-api-access-gllxz\") pod \"b2008d0e-e957-41d4-946a-c057dfe90bfb\" (UID: \"b2008d0e-e957-41d4-946a-c057dfe90bfb\") " Dec 11 05:30:02 crc kubenswrapper[4628]: I1211 05:30:02.888564 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2008d0e-e957-41d4-946a-c057dfe90bfb-config-volume\") pod \"b2008d0e-e957-41d4-946a-c057dfe90bfb\" (UID: \"b2008d0e-e957-41d4-946a-c057dfe90bfb\") " Dec 11 05:30:02 crc kubenswrapper[4628]: I1211 05:30:02.888597 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2008d0e-e957-41d4-946a-c057dfe90bfb-secret-volume\") pod \"b2008d0e-e957-41d4-946a-c057dfe90bfb\" (UID: \"b2008d0e-e957-41d4-946a-c057dfe90bfb\") " Dec 11 05:30:02 crc kubenswrapper[4628]: I1211 05:30:02.889476 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2008d0e-e957-41d4-946a-c057dfe90bfb-config-volume" (OuterVolumeSpecName: "config-volume") pod "b2008d0e-e957-41d4-946a-c057dfe90bfb" (UID: "b2008d0e-e957-41d4-946a-c057dfe90bfb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:30:02 crc kubenswrapper[4628]: I1211 05:30:02.896794 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2008d0e-e957-41d4-946a-c057dfe90bfb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b2008d0e-e957-41d4-946a-c057dfe90bfb" (UID: "b2008d0e-e957-41d4-946a-c057dfe90bfb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:30:02 crc kubenswrapper[4628]: I1211 05:30:02.904903 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2008d0e-e957-41d4-946a-c057dfe90bfb-kube-api-access-gllxz" (OuterVolumeSpecName: "kube-api-access-gllxz") pod "b2008d0e-e957-41d4-946a-c057dfe90bfb" (UID: "b2008d0e-e957-41d4-946a-c057dfe90bfb"). InnerVolumeSpecName "kube-api-access-gllxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:30:02 crc kubenswrapper[4628]: I1211 05:30:02.989741 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gllxz\" (UniqueName: \"kubernetes.io/projected/b2008d0e-e957-41d4-946a-c057dfe90bfb-kube-api-access-gllxz\") on node \"crc\" DevicePath \"\"" Dec 11 05:30:02 crc kubenswrapper[4628]: I1211 05:30:02.989781 4628 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2008d0e-e957-41d4-946a-c057dfe90bfb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 11 05:30:02 crc kubenswrapper[4628]: I1211 05:30:02.989794 4628 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2008d0e-e957-41d4-946a-c057dfe90bfb-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.443821 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-stwfj"] Dec 11 05:30:03 crc kubenswrapper[4628]: E1211 05:30:03.444382 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2008d0e-e957-41d4-946a-c057dfe90bfb" containerName="collect-profiles" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.444400 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2008d0e-e957-41d4-946a-c057dfe90bfb" containerName="collect-profiles" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.444535 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2008d0e-e957-41d4-946a-c057dfe90bfb" containerName="collect-profiles" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.445244 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-stwfj" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.449386 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.449897 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.450248 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.453885 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-fxfkf" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.462900 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-stwfj"] Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.491516 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9" event={"ID":"b2008d0e-e957-41d4-946a-c057dfe90bfb","Type":"ContainerDied","Data":"d940aec76f8d13df8c9846eaedc9f193df2e609aea11ffd60e1df3d5c03995fe"} Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.491575 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d940aec76f8d13df8c9846eaedc9f193df2e609aea11ffd60e1df3d5c03995fe" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.491595 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.525973 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fgwt6"] Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.533376 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fgwt6" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.538291 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.548233 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fgwt6"] Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.598437 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b14da2d6-24da-40bb-8e6a-ec3dc566ad8d-config\") pod \"dnsmasq-dns-675f4bcbfc-stwfj\" (UID: \"b14da2d6-24da-40bb-8e6a-ec3dc566ad8d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-stwfj" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.598520 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slx8j\" (UniqueName: \"kubernetes.io/projected/b14da2d6-24da-40bb-8e6a-ec3dc566ad8d-kube-api-access-slx8j\") pod \"dnsmasq-dns-675f4bcbfc-stwfj\" (UID: \"b14da2d6-24da-40bb-8e6a-ec3dc566ad8d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-stwfj" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.700258 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67c605f5-a2d1-4a12-9244-107134d07f1c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fgwt6\" (UID: \"67c605f5-a2d1-4a12-9244-107134d07f1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fgwt6" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.700734 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b14da2d6-24da-40bb-8e6a-ec3dc566ad8d-config\") pod \"dnsmasq-dns-675f4bcbfc-stwfj\" (UID: \"b14da2d6-24da-40bb-8e6a-ec3dc566ad8d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-stwfj" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.700939 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67c605f5-a2d1-4a12-9244-107134d07f1c-config\") pod \"dnsmasq-dns-78dd6ddcc-fgwt6\" (UID: \"67c605f5-a2d1-4a12-9244-107134d07f1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fgwt6" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.701099 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vxkx\" (UniqueName: \"kubernetes.io/projected/67c605f5-a2d1-4a12-9244-107134d07f1c-kube-api-access-2vxkx\") pod \"dnsmasq-dns-78dd6ddcc-fgwt6\" (UID: \"67c605f5-a2d1-4a12-9244-107134d07f1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fgwt6" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.701311 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slx8j\" (UniqueName: \"kubernetes.io/projected/b14da2d6-24da-40bb-8e6a-ec3dc566ad8d-kube-api-access-slx8j\") pod \"dnsmasq-dns-675f4bcbfc-stwfj\" (UID: \"b14da2d6-24da-40bb-8e6a-ec3dc566ad8d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-stwfj" 
Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.701596 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b14da2d6-24da-40bb-8e6a-ec3dc566ad8d-config\") pod \"dnsmasq-dns-675f4bcbfc-stwfj\" (UID: \"b14da2d6-24da-40bb-8e6a-ec3dc566ad8d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-stwfj" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.731820 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slx8j\" (UniqueName: \"kubernetes.io/projected/b14da2d6-24da-40bb-8e6a-ec3dc566ad8d-kube-api-access-slx8j\") pod \"dnsmasq-dns-675f4bcbfc-stwfj\" (UID: \"b14da2d6-24da-40bb-8e6a-ec3dc566ad8d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-stwfj" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.761181 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-stwfj" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.802744 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67c605f5-a2d1-4a12-9244-107134d07f1c-config\") pod \"dnsmasq-dns-78dd6ddcc-fgwt6\" (UID: \"67c605f5-a2d1-4a12-9244-107134d07f1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fgwt6" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.802798 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vxkx\" (UniqueName: \"kubernetes.io/projected/67c605f5-a2d1-4a12-9244-107134d07f1c-kube-api-access-2vxkx\") pod \"dnsmasq-dns-78dd6ddcc-fgwt6\" (UID: \"67c605f5-a2d1-4a12-9244-107134d07f1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fgwt6" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.802947 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67c605f5-a2d1-4a12-9244-107134d07f1c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fgwt6\" (UID: \"67c605f5-a2d1-4a12-9244-107134d07f1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fgwt6" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.804414 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67c605f5-a2d1-4a12-9244-107134d07f1c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-fgwt6\" (UID: \"67c605f5-a2d1-4a12-9244-107134d07f1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fgwt6" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.805246 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67c605f5-a2d1-4a12-9244-107134d07f1c-config\") pod \"dnsmasq-dns-78dd6ddcc-fgwt6\" (UID: \"67c605f5-a2d1-4a12-9244-107134d07f1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fgwt6" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.825035 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vxkx\" (UniqueName: \"kubernetes.io/projected/67c605f5-a2d1-4a12-9244-107134d07f1c-kube-api-access-2vxkx\") pod \"dnsmasq-dns-78dd6ddcc-fgwt6\" (UID: \"67c605f5-a2d1-4a12-9244-107134d07f1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-fgwt6" Dec 11 05:30:03 crc kubenswrapper[4628]: I1211 05:30:03.848025 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fgwt6" Dec 11 05:30:04 crc kubenswrapper[4628]: I1211 05:30:04.165820 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fgwt6"] Dec 11 05:30:04 crc kubenswrapper[4628]: W1211 05:30:04.169183 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67c605f5_a2d1_4a12_9244_107134d07f1c.slice/crio-edd2b4ad04e52e6e2397b4070144d7e8dbb1e1953ec1d28cb75557d1296b6a9d WatchSource:0}: Error finding container edd2b4ad04e52e6e2397b4070144d7e8dbb1e1953ec1d28cb75557d1296b6a9d: Status 404 returned error can't find the container with id edd2b4ad04e52e6e2397b4070144d7e8dbb1e1953ec1d28cb75557d1296b6a9d Dec 11 05:30:04 crc kubenswrapper[4628]: I1211 05:30:04.224582 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-stwfj"] Dec 11 05:30:04 crc kubenswrapper[4628]: W1211 05:30:04.229431 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb14da2d6_24da_40bb_8e6a_ec3dc566ad8d.slice/crio-5d2b867cc74696bacbcfa29a7401e33bcf7bbbc2522ed63873c7ca93ac5bb9b5 WatchSource:0}: Error finding container 5d2b867cc74696bacbcfa29a7401e33bcf7bbbc2522ed63873c7ca93ac5bb9b5: Status 404 returned error can't find the container with id 5d2b867cc74696bacbcfa29a7401e33bcf7bbbc2522ed63873c7ca93ac5bb9b5 Dec 11 05:30:04 crc kubenswrapper[4628]: I1211 05:30:04.500426 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-stwfj" event={"ID":"b14da2d6-24da-40bb-8e6a-ec3dc566ad8d","Type":"ContainerStarted","Data":"5d2b867cc74696bacbcfa29a7401e33bcf7bbbc2522ed63873c7ca93ac5bb9b5"} Dec 11 05:30:04 crc kubenswrapper[4628]: I1211 05:30:04.502529 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-fgwt6" event={"ID":"67c605f5-a2d1-4a12-9244-107134d07f1c","Type":"ContainerStarted","Data":"edd2b4ad04e52e6e2397b4070144d7e8dbb1e1953ec1d28cb75557d1296b6a9d"} Dec 11 05:30:06 crc kubenswrapper[4628]: I1211 05:30:06.790818 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-stwfj"] Dec 11 05:30:06 crc kubenswrapper[4628]: I1211 05:30:06.819252 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bxzmm"] Dec 11 05:30:06 crc kubenswrapper[4628]: I1211 05:30:06.820449 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-bxzmm" Dec 11 05:30:06 crc kubenswrapper[4628]: I1211 05:30:06.843368 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bxzmm"] Dec 11 05:30:06 crc kubenswrapper[4628]: I1211 05:30:06.957929 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/534bc26b-a52d-40e7-8ac2-4aa407d070a9-config\") pod \"dnsmasq-dns-666b6646f7-bxzmm\" (UID: \"534bc26b-a52d-40e7-8ac2-4aa407d070a9\") " pod="openstack/dnsmasq-dns-666b6646f7-bxzmm" Dec 11 05:30:06 crc kubenswrapper[4628]: I1211 05:30:06.958356 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/534bc26b-a52d-40e7-8ac2-4aa407d070a9-dns-svc\") pod \"dnsmasq-dns-666b6646f7-bxzmm\" (UID: \"534bc26b-a52d-40e7-8ac2-4aa407d070a9\") " pod="openstack/dnsmasq-dns-666b6646f7-bxzmm" Dec 11 05:30:06 crc kubenswrapper[4628]: I1211 05:30:06.958739 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hqrl\" (UniqueName: \"kubernetes.io/projected/534bc26b-a52d-40e7-8ac2-4aa407d070a9-kube-api-access-6hqrl\") pod \"dnsmasq-dns-666b6646f7-bxzmm\" (UID: \"534bc26b-a52d-40e7-8ac2-4aa407d070a9\") " pod="openstack/dnsmasq-dns-666b6646f7-bxzmm" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.059984 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hqrl\" (UniqueName: \"kubernetes.io/projected/534bc26b-a52d-40e7-8ac2-4aa407d070a9-kube-api-access-6hqrl\") pod \"dnsmasq-dns-666b6646f7-bxzmm\" (UID: \"534bc26b-a52d-40e7-8ac2-4aa407d070a9\") " pod="openstack/dnsmasq-dns-666b6646f7-bxzmm" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.060084 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/534bc26b-a52d-40e7-8ac2-4aa407d070a9-config\") pod \"dnsmasq-dns-666b6646f7-bxzmm\" (UID: \"534bc26b-a52d-40e7-8ac2-4aa407d070a9\") " pod="openstack/dnsmasq-dns-666b6646f7-bxzmm" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.060113 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/534bc26b-a52d-40e7-8ac2-4aa407d070a9-dns-svc\") pod \"dnsmasq-dns-666b6646f7-bxzmm\" (UID: \"534bc26b-a52d-40e7-8ac2-4aa407d070a9\") " pod="openstack/dnsmasq-dns-666b6646f7-bxzmm" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.061510 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/534bc26b-a52d-40e7-8ac2-4aa407d070a9-dns-svc\") pod \"dnsmasq-dns-666b6646f7-bxzmm\" (UID: \"534bc26b-a52d-40e7-8ac2-4aa407d070a9\") " pod="openstack/dnsmasq-dns-666b6646f7-bxzmm" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.062208 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/534bc26b-a52d-40e7-8ac2-4aa407d070a9-config\") pod \"dnsmasq-dns-666b6646f7-bxzmm\" (UID: \"534bc26b-a52d-40e7-8ac2-4aa407d070a9\") " pod="openstack/dnsmasq-dns-666b6646f7-bxzmm" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.117446 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hqrl\" (UniqueName: 
\"kubernetes.io/projected/534bc26b-a52d-40e7-8ac2-4aa407d070a9-kube-api-access-6hqrl\") pod \"dnsmasq-dns-666b6646f7-bxzmm\" (UID: \"534bc26b-a52d-40e7-8ac2-4aa407d070a9\") " pod="openstack/dnsmasq-dns-666b6646f7-bxzmm" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.162529 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-bxzmm" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.179884 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fgwt6"] Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.211540 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4ms4n"] Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.212660 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.232147 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4ms4n"] Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.365877 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bq6n\" (UniqueName: \"kubernetes.io/projected/72259b41-2e95-4531-a2ff-2939e437253c-kube-api-access-5bq6n\") pod \"dnsmasq-dns-57d769cc4f-4ms4n\" (UID: \"72259b41-2e95-4531-a2ff-2939e437253c\") " pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.366384 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72259b41-2e95-4531-a2ff-2939e437253c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4ms4n\" (UID: \"72259b41-2e95-4531-a2ff-2939e437253c\") " pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.366539 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72259b41-2e95-4531-a2ff-2939e437253c-config\") pod \"dnsmasq-dns-57d769cc4f-4ms4n\" (UID: \"72259b41-2e95-4531-a2ff-2939e437253c\") " pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.470990 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72259b41-2e95-4531-a2ff-2939e437253c-config\") pod \"dnsmasq-dns-57d769cc4f-4ms4n\" (UID: \"72259b41-2e95-4531-a2ff-2939e437253c\") " pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.471264 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bq6n\" (UniqueName: \"kubernetes.io/projected/72259b41-2e95-4531-a2ff-2939e437253c-kube-api-access-5bq6n\") pod \"dnsmasq-dns-57d769cc4f-4ms4n\" (UID: \"72259b41-2e95-4531-a2ff-2939e437253c\") " pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.471316 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72259b41-2e95-4531-a2ff-2939e437253c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4ms4n\" (UID: \"72259b41-2e95-4531-a2ff-2939e437253c\") " pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.472216 4628 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72259b41-2e95-4531-a2ff-2939e437253c-config\") pod \"dnsmasq-dns-57d769cc4f-4ms4n\" (UID: \"72259b41-2e95-4531-a2ff-2939e437253c\") " pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.473720 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72259b41-2e95-4531-a2ff-2939e437253c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4ms4n\" (UID: \"72259b41-2e95-4531-a2ff-2939e437253c\") " pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.510917 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bq6n\" (UniqueName: \"kubernetes.io/projected/72259b41-2e95-4531-a2ff-2939e437253c-kube-api-access-5bq6n\") pod \"dnsmasq-dns-57d769cc4f-4ms4n\" (UID: \"72259b41-2e95-4531-a2ff-2939e437253c\") " pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.542872 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.910674 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bxzmm"] Dec 11 05:30:07 crc kubenswrapper[4628]: W1211 05:30:07.915743 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod534bc26b_a52d_40e7_8ac2_4aa407d070a9.slice/crio-7f06b67a6f47fc73ef35f9a9318d29e11e045a40639362f1ac05c45c8aa2862c WatchSource:0}: Error finding container 7f06b67a6f47fc73ef35f9a9318d29e11e045a40639362f1ac05c45c8aa2862c: Status 404 returned error can't find the container with id 7f06b67a6f47fc73ef35f9a9318d29e11e045a40639362f1ac05c45c8aa2862c Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.964089 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4ms4n"] Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.981347 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.982717 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.984746 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.984808 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lcmx8" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.984975 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.986082 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.989398 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.989523 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.989711 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 11 05:30:07 crc kubenswrapper[4628]: I1211 05:30:07.994900 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.084110 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5279d32c-7625-460c-881b-243e69077070-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.084165 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5279d32c-7625-460c-881b-243e69077070-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.084274 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5279d32c-7625-460c-881b-243e69077070-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.084319 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5279d32c-7625-460c-881b-243e69077070-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.084417 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5279d32c-7625-460c-881b-243e69077070-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.084528 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/5279d32c-7625-460c-881b-243e69077070-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.084565 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5279d32c-7625-460c-881b-243e69077070-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.084584 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.084607 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8mrt\" (UniqueName: \"kubernetes.io/projected/5279d32c-7625-460c-881b-243e69077070-kube-api-access-p8mrt\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.084710 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5279d32c-7625-460c-881b-243e69077070-config-data\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.084738 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5279d32c-7625-460c-881b-243e69077070-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.186028 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5279d32c-7625-460c-881b-243e69077070-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.186091 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5279d32c-7625-460c-881b-243e69077070-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.186113 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5279d32c-7625-460c-881b-243e69077070-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.186167 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5279d32c-7625-460c-881b-243e69077070-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " 
pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.186204 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5279d32c-7625-460c-881b-243e69077070-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.186525 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.186564 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5279d32c-7625-460c-881b-243e69077070-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.186605 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8mrt\" (UniqueName: \"kubernetes.io/projected/5279d32c-7625-460c-881b-243e69077070-kube-api-access-p8mrt\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.187015 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5279d32c-7625-460c-881b-243e69077070-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.187256 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5279d32c-7625-460c-881b-243e69077070-config-data\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.187281 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5279d32c-7625-460c-881b-243e69077070-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.187326 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5279d32c-7625-460c-881b-243e69077070-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.187379 4628 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.188163 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/5279d32c-7625-460c-881b-243e69077070-config-data\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.189778 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5279d32c-7625-460c-881b-243e69077070-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.190978 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5279d32c-7625-460c-881b-243e69077070-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.191758 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5279d32c-7625-460c-881b-243e69077070-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.192721 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5279d32c-7625-460c-881b-243e69077070-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.196190 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5279d32c-7625-460c-881b-243e69077070-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.203356 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5279d32c-7625-460c-881b-243e69077070-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.207905 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8mrt\" (UniqueName: \"kubernetes.io/projected/5279d32c-7625-460c-881b-243e69077070-kube-api-access-p8mrt\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.218094 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5279d32c-7625-460c-881b-243e69077070-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.223612 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.311829 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.355750 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.357139 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.359625 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.362739 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.363211 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.363449 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.363546 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.363685 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.363734 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.363514 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hkk9m" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.493514 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a07218df-1f25-47c4-89dc-2c7ce7f406ac-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.493588 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a07218df-1f25-47c4-89dc-2c7ce7f406ac-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.493690 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a07218df-1f25-47c4-89dc-2c7ce7f406ac-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.493740 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvz72\" (UniqueName: \"kubernetes.io/projected/a07218df-1f25-47c4-89dc-2c7ce7f406ac-kube-api-access-zvz72\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.493830 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.493887 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.493972 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a07218df-1f25-47c4-89dc-2c7ce7f406ac-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.494028 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.494047 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.494061 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.494129 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a07218df-1f25-47c4-89dc-2c7ce7f406ac-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.560358 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" event={"ID":"72259b41-2e95-4531-a2ff-2939e437253c","Type":"ContainerStarted","Data":"b09f065b504e0afc5873f315b7a7b94b143331a0842a628784021dd0e573d97a"} Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.563094 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-bxzmm" event={"ID":"534bc26b-a52d-40e7-8ac2-4aa407d070a9","Type":"ContainerStarted","Data":"7f06b67a6f47fc73ef35f9a9318d29e11e045a40639362f1ac05c45c8aa2862c"} Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.604973 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a07218df-1f25-47c4-89dc-2c7ce7f406ac-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.605033 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a07218df-1f25-47c4-89dc-2c7ce7f406ac-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.605072 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a07218df-1f25-47c4-89dc-2c7ce7f406ac-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.605094 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a07218df-1f25-47c4-89dc-2c7ce7f406ac-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.605115 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvz72\" (UniqueName: \"kubernetes.io/projected/a07218df-1f25-47c4-89dc-2c7ce7f406ac-kube-api-access-zvz72\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.605152 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.605169 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.605194 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a07218df-1f25-47c4-89dc-2c7ce7f406ac-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.605225 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.605241 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.605256 4628 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.605747 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.608633 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a07218df-1f25-47c4-89dc-2c7ce7f406ac-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.609176 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a07218df-1f25-47c4-89dc-2c7ce7f406ac-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.611566 4628 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.631230 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a07218df-1f25-47c4-89dc-2c7ce7f406ac-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.633938 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.634690 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a07218df-1f25-47c4-89dc-2c7ce7f406ac-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.635669 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.636834 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.639824 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a07218df-1f25-47c4-89dc-2c7ce7f406ac-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.653530 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvz72\" (UniqueName: \"kubernetes.io/projected/a07218df-1f25-47c4-89dc-2c7ce7f406ac-kube-api-access-zvz72\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.680997 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.712135 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:30:08 crc kubenswrapper[4628]: I1211 05:30:08.887978 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.221081 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 11 05:30:09 crc kubenswrapper[4628]: W1211 05:30:09.245089 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda07218df_1f25_47c4_89dc_2c7ce7f406ac.slice/crio-1f4cdf3765a685d6f721381dcccdc3d0d1cac66291bf30cbf2286158eb9fb311 WatchSource:0}: Error finding container 1f4cdf3765a685d6f721381dcccdc3d0d1cac66291bf30cbf2286158eb9fb311: Status 404 returned error can't find the container with id 1f4cdf3765a685d6f721381dcccdc3d0d1cac66291bf30cbf2286158eb9fb311 Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.570466 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a07218df-1f25-47c4-89dc-2c7ce7f406ac","Type":"ContainerStarted","Data":"1f4cdf3765a685d6f721381dcccdc3d0d1cac66291bf30cbf2286158eb9fb311"} Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.572335 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5279d32c-7625-460c-881b-243e69077070","Type":"ContainerStarted","Data":"79ba71801b18823c7e6b1221d3ed727d096019e948b7b88cbdc863082e6dbfe9"} Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.656471 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.658054 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.664426 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-5mlnq" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.664824 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.665915 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.666110 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.670281 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.670419 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.824065 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4498a18-7449-45b3-9061-d3ffbfa4be5b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.824360 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4498a18-7449-45b3-9061-d3ffbfa4be5b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.824416 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.824433 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4498a18-7449-45b3-9061-d3ffbfa4be5b-kolla-config\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.824470 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e4498a18-7449-45b3-9061-d3ffbfa4be5b-config-data-default\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.824489 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x5xl\" (UniqueName: \"kubernetes.io/projected/e4498a18-7449-45b3-9061-d3ffbfa4be5b-kube-api-access-6x5xl\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.824512 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4498a18-7449-45b3-9061-d3ffbfa4be5b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.824531 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e4498a18-7449-45b3-9061-d3ffbfa4be5b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.925732 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4498a18-7449-45b3-9061-d3ffbfa4be5b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.925776 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4498a18-7449-45b3-9061-d3ffbfa4be5b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.925826 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.925861 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4498a18-7449-45b3-9061-d3ffbfa4be5b-kolla-config\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.925899 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e4498a18-7449-45b3-9061-d3ffbfa4be5b-config-data-default\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.925927 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x5xl\" (UniqueName: \"kubernetes.io/projected/e4498a18-7449-45b3-9061-d3ffbfa4be5b-kube-api-access-6x5xl\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.925948 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4498a18-7449-45b3-9061-d3ffbfa4be5b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.925967 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e4498a18-7449-45b3-9061-d3ffbfa4be5b-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.926423 4628 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.926437 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e4498a18-7449-45b3-9061-d3ffbfa4be5b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.926767 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4498a18-7449-45b3-9061-d3ffbfa4be5b-kolla-config\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.927020 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e4498a18-7449-45b3-9061-d3ffbfa4be5b-config-data-default\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.929301 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4498a18-7449-45b3-9061-d3ffbfa4be5b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.929889 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4498a18-7449-45b3-9061-d3ffbfa4be5b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.948342 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4498a18-7449-45b3-9061-d3ffbfa4be5b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.977750 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x5xl\" (UniqueName: \"kubernetes.io/projected/e4498a18-7449-45b3-9061-d3ffbfa4be5b-kube-api-access-6x5xl\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.979050 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"e4498a18-7449-45b3-9061-d3ffbfa4be5b\") " pod="openstack/openstack-galera-0" Dec 11 05:30:09 crc kubenswrapper[4628]: I1211 05:30:09.986340 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 11 05:30:10 crc kubenswrapper[4628]: I1211 05:30:10.412671 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.085886 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.090093 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.100368 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.100559 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-mpwdj" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.100665 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.100765 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.126544 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.259558 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.259620 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.259652 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbz77\" (UniqueName: \"kubernetes.io/projected/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-kube-api-access-pbz77\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.259674 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.259722 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.259743 4628 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.259774 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.259802 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.360744 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.360812 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.360835 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.360911 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.360939 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.360958 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.360996 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.361022 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbz77\" (UniqueName: \"kubernetes.io/projected/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-kube-api-access-pbz77\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.361634 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.361860 4628 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.365511 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.369031 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.370156 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.388253 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.399505 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.422998 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.424382 4628 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/memcached-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.426569 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.427107 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.427414 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-55ppk" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.427508 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbz77\" (UniqueName: \"kubernetes.io/projected/f5879bd1-c58f-4c7a-8158-8be2bd632bf8-kube-api-access-pbz77\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.435887 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.438361 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f5879bd1-c58f-4c7a-8158-8be2bd632bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.564570 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a14ea6c9-f372-463b-8485-a3411412cbe9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a14ea6c9-f372-463b-8485-a3411412cbe9\") " pod="openstack/memcached-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.564636 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a14ea6c9-f372-463b-8485-a3411412cbe9-kolla-config\") pod \"memcached-0\" (UID: \"a14ea6c9-f372-463b-8485-a3411412cbe9\") " pod="openstack/memcached-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.564679 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2gvl\" (UniqueName: \"kubernetes.io/projected/a14ea6c9-f372-463b-8485-a3411412cbe9-kube-api-access-n2gvl\") pod \"memcached-0\" (UID: \"a14ea6c9-f372-463b-8485-a3411412cbe9\") " pod="openstack/memcached-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.564710 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14ea6c9-f372-463b-8485-a3411412cbe9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a14ea6c9-f372-463b-8485-a3411412cbe9\") " pod="openstack/memcached-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.564728 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a14ea6c9-f372-463b-8485-a3411412cbe9-config-data\") pod \"memcached-0\" (UID: \"a14ea6c9-f372-463b-8485-a3411412cbe9\") " pod="openstack/memcached-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.666492 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a14ea6c9-f372-463b-8485-a3411412cbe9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a14ea6c9-f372-463b-8485-a3411412cbe9\") " pod="openstack/memcached-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.666561 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a14ea6c9-f372-463b-8485-a3411412cbe9-kolla-config\") pod \"memcached-0\" (UID: \"a14ea6c9-f372-463b-8485-a3411412cbe9\") " pod="openstack/memcached-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.666603 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2gvl\" (UniqueName: \"kubernetes.io/projected/a14ea6c9-f372-463b-8485-a3411412cbe9-kube-api-access-n2gvl\") pod \"memcached-0\" (UID: \"a14ea6c9-f372-463b-8485-a3411412cbe9\") " pod="openstack/memcached-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.666638 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14ea6c9-f372-463b-8485-a3411412cbe9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a14ea6c9-f372-463b-8485-a3411412cbe9\") " pod="openstack/memcached-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.666654 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a14ea6c9-f372-463b-8485-a3411412cbe9-config-data\") pod \"memcached-0\" (UID: \"a14ea6c9-f372-463b-8485-a3411412cbe9\") " pod="openstack/memcached-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.667354 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a14ea6c9-f372-463b-8485-a3411412cbe9-config-data\") pod \"memcached-0\" (UID: \"a14ea6c9-f372-463b-8485-a3411412cbe9\") " pod="openstack/memcached-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.668408 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a14ea6c9-f372-463b-8485-a3411412cbe9-kolla-config\") pod \"memcached-0\" (UID: \"a14ea6c9-f372-463b-8485-a3411412cbe9\") " pod="openstack/memcached-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.703718 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14ea6c9-f372-463b-8485-a3411412cbe9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a14ea6c9-f372-463b-8485-a3411412cbe9\") " pod="openstack/memcached-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.704524 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a14ea6c9-f372-463b-8485-a3411412cbe9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a14ea6c9-f372-463b-8485-a3411412cbe9\") " pod="openstack/memcached-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.734091 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.748338 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2gvl\" (UniqueName: \"kubernetes.io/projected/a14ea6c9-f372-463b-8485-a3411412cbe9-kube-api-access-n2gvl\") pod \"memcached-0\" (UID: \"a14ea6c9-f372-463b-8485-a3411412cbe9\") " pod="openstack/memcached-0" Dec 11 05:30:11 crc kubenswrapper[4628]: I1211 05:30:11.817127 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 11 05:30:13 crc kubenswrapper[4628]: I1211 05:30:13.260336 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 11 05:30:13 crc kubenswrapper[4628]: I1211 05:30:13.265577 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 11 05:30:13 crc kubenswrapper[4628]: I1211 05:30:13.268952 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-nfv9x" Dec 11 05:30:13 crc kubenswrapper[4628]: I1211 05:30:13.277726 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 11 05:30:13 crc kubenswrapper[4628]: I1211 05:30:13.398810 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s828\" (UniqueName: \"kubernetes.io/projected/ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8-kube-api-access-5s828\") pod \"kube-state-metrics-0\" (UID: \"ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8\") " pod="openstack/kube-state-metrics-0" Dec 11 05:30:13 crc kubenswrapper[4628]: I1211 05:30:13.500179 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s828\" (UniqueName: \"kubernetes.io/projected/ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8-kube-api-access-5s828\") pod \"kube-state-metrics-0\" (UID: \"ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8\") " pod="openstack/kube-state-metrics-0" Dec 11 05:30:13 crc kubenswrapper[4628]: I1211 05:30:13.524053 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s828\" (UniqueName: \"kubernetes.io/projected/ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8-kube-api-access-5s828\") pod \"kube-state-metrics-0\" (UID: \"ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8\") " pod="openstack/kube-state-metrics-0" Dec 11 05:30:13 crc kubenswrapper[4628]: I1211 05:30:13.592442 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 11 05:30:16 crc kubenswrapper[4628]: I1211 05:30:16.660956 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e4498a18-7449-45b3-9061-d3ffbfa4be5b","Type":"ContainerStarted","Data":"c4b4fe1c2c57e0c76486024af7fb2e2d67cbfd740208191813295ad50c4f5df7"} Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.547278 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qz7fr"] Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.548404 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.549900 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-fl268" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.551658 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.553053 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.554360 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-gncbg"] Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.557258 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.563919 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qz7fr"] Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.584363 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gncbg"] Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.687043 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0885fe4-936a-4a13-b4e5-4aeee593c242-scripts\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.687116 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdrmc\" (UniqueName: \"kubernetes.io/projected/d0885fe4-936a-4a13-b4e5-4aeee593c242-kube-api-access-wdrmc\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.687197 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7c72a5ae-bbee-41cd-bb23-b9feb77f594d-var-log\") pod \"ovn-controller-ovs-gncbg\" (UID: \"7c72a5ae-bbee-41cd-bb23-b9feb77f594d\") " pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.687227 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0885fe4-936a-4a13-b4e5-4aeee593c242-combined-ca-bundle\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.687268 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c72a5ae-bbee-41cd-bb23-b9feb77f594d-scripts\") pod \"ovn-controller-ovs-gncbg\" (UID: \"7c72a5ae-bbee-41cd-bb23-b9feb77f594d\") " pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.687289 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0885fe4-936a-4a13-b4e5-4aeee593c242-var-run-ovn\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " 
pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.687344 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpk8b\" (UniqueName: \"kubernetes.io/projected/7c72a5ae-bbee-41cd-bb23-b9feb77f594d-kube-api-access-rpk8b\") pod \"ovn-controller-ovs-gncbg\" (UID: \"7c72a5ae-bbee-41cd-bb23-b9feb77f594d\") " pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.687443 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d0885fe4-936a-4a13-b4e5-4aeee593c242-var-log-ovn\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.687509 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0885fe4-936a-4a13-b4e5-4aeee593c242-ovn-controller-tls-certs\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.687543 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d0885fe4-936a-4a13-b4e5-4aeee593c242-var-run\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.687579 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7c72a5ae-bbee-41cd-bb23-b9feb77f594d-etc-ovs\") pod \"ovn-controller-ovs-gncbg\" (UID: \"7c72a5ae-bbee-41cd-bb23-b9feb77f594d\") " pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.687635 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7c72a5ae-bbee-41cd-bb23-b9feb77f594d-var-lib\") pod \"ovn-controller-ovs-gncbg\" (UID: \"7c72a5ae-bbee-41cd-bb23-b9feb77f594d\") " pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.687660 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7c72a5ae-bbee-41cd-bb23-b9feb77f594d-var-run\") pod \"ovn-controller-ovs-gncbg\" (UID: \"7c72a5ae-bbee-41cd-bb23-b9feb77f594d\") " pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.788479 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpk8b\" (UniqueName: \"kubernetes.io/projected/7c72a5ae-bbee-41cd-bb23-b9feb77f594d-kube-api-access-rpk8b\") pod \"ovn-controller-ovs-gncbg\" (UID: \"7c72a5ae-bbee-41cd-bb23-b9feb77f594d\") " pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.788536 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d0885fe4-936a-4a13-b4e5-4aeee593c242-var-log-ovn\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 
05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.788560 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0885fe4-936a-4a13-b4e5-4aeee593c242-ovn-controller-tls-certs\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.788581 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d0885fe4-936a-4a13-b4e5-4aeee593c242-var-run\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.788601 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7c72a5ae-bbee-41cd-bb23-b9feb77f594d-etc-ovs\") pod \"ovn-controller-ovs-gncbg\" (UID: \"7c72a5ae-bbee-41cd-bb23-b9feb77f594d\") " pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.788628 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7c72a5ae-bbee-41cd-bb23-b9feb77f594d-var-lib\") pod \"ovn-controller-ovs-gncbg\" (UID: \"7c72a5ae-bbee-41cd-bb23-b9feb77f594d\") " pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.788646 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7c72a5ae-bbee-41cd-bb23-b9feb77f594d-var-run\") pod \"ovn-controller-ovs-gncbg\" (UID: \"7c72a5ae-bbee-41cd-bb23-b9feb77f594d\") " pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.788664 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0885fe4-936a-4a13-b4e5-4aeee593c242-scripts\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.788688 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdrmc\" (UniqueName: \"kubernetes.io/projected/d0885fe4-936a-4a13-b4e5-4aeee593c242-kube-api-access-wdrmc\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.788718 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7c72a5ae-bbee-41cd-bb23-b9feb77f594d-var-log\") pod \"ovn-controller-ovs-gncbg\" (UID: \"7c72a5ae-bbee-41cd-bb23-b9feb77f594d\") " pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.788736 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0885fe4-936a-4a13-b4e5-4aeee593c242-var-run-ovn\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.788760 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d0885fe4-936a-4a13-b4e5-4aeee593c242-combined-ca-bundle\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.788775 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c72a5ae-bbee-41cd-bb23-b9feb77f594d-scripts\") pod \"ovn-controller-ovs-gncbg\" (UID: \"7c72a5ae-bbee-41cd-bb23-b9feb77f594d\") " pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.789084 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d0885fe4-936a-4a13-b4e5-4aeee593c242-var-log-ovn\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.789154 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7c72a5ae-bbee-41cd-bb23-b9feb77f594d-etc-ovs\") pod \"ovn-controller-ovs-gncbg\" (UID: \"7c72a5ae-bbee-41cd-bb23-b9feb77f594d\") " pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.789203 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7c72a5ae-bbee-41cd-bb23-b9feb77f594d-var-log\") pod \"ovn-controller-ovs-gncbg\" (UID: \"7c72a5ae-bbee-41cd-bb23-b9feb77f594d\") " pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.789223 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d0885fe4-936a-4a13-b4e5-4aeee593c242-var-run\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.789377 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7c72a5ae-bbee-41cd-bb23-b9feb77f594d-var-lib\") pod \"ovn-controller-ovs-gncbg\" (UID: \"7c72a5ae-bbee-41cd-bb23-b9feb77f594d\") " pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.789459 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0885fe4-936a-4a13-b4e5-4aeee593c242-var-run-ovn\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.791101 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0885fe4-936a-4a13-b4e5-4aeee593c242-scripts\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.791602 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c72a5ae-bbee-41cd-bb23-b9feb77f594d-scripts\") pod \"ovn-controller-ovs-gncbg\" (UID: \"7c72a5ae-bbee-41cd-bb23-b9feb77f594d\") " pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.792965 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7c72a5ae-bbee-41cd-bb23-b9feb77f594d-var-run\") pod \"ovn-controller-ovs-gncbg\" (UID: \"7c72a5ae-bbee-41cd-bb23-b9feb77f594d\") " pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.795819 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0885fe4-936a-4a13-b4e5-4aeee593c242-ovn-controller-tls-certs\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.813345 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpk8b\" (UniqueName: \"kubernetes.io/projected/7c72a5ae-bbee-41cd-bb23-b9feb77f594d-kube-api-access-rpk8b\") pod \"ovn-controller-ovs-gncbg\" (UID: \"7c72a5ae-bbee-41cd-bb23-b9feb77f594d\") " pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.818768 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0885fe4-936a-4a13-b4e5-4aeee593c242-combined-ca-bundle\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.828496 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdrmc\" (UniqueName: \"kubernetes.io/projected/d0885fe4-936a-4a13-b4e5-4aeee593c242-kube-api-access-wdrmc\") pod \"ovn-controller-qz7fr\" (UID: \"d0885fe4-936a-4a13-b4e5-4aeee593c242\") " pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.869492 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:17 crc kubenswrapper[4628]: I1211 05:30:17.880281 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.155789 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.157616 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.163006 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.163077 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-d2rb7" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.163233 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.163333 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.163417 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.216812 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.227598 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.227647 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b16de833-9dc8-4e72-92b8-9374c7ab50bf-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.227673 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b16de833-9dc8-4e72-92b8-9374c7ab50bf-config\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.227707 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16de833-9dc8-4e72-92b8-9374c7ab50bf-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.227727 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkfqz\" (UniqueName: \"kubernetes.io/projected/b16de833-9dc8-4e72-92b8-9374c7ab50bf-kube-api-access-jkfqz\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.227781 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b16de833-9dc8-4e72-92b8-9374c7ab50bf-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.227795 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/b16de833-9dc8-4e72-92b8-9374c7ab50bf-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.227818 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b16de833-9dc8-4e72-92b8-9374c7ab50bf-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.329462 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.329517 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b16de833-9dc8-4e72-92b8-9374c7ab50bf-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.329579 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b16de833-9dc8-4e72-92b8-9374c7ab50bf-config\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.329614 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16de833-9dc8-4e72-92b8-9374c7ab50bf-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.329635 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkfqz\" (UniqueName: \"kubernetes.io/projected/b16de833-9dc8-4e72-92b8-9374c7ab50bf-kube-api-access-jkfqz\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.329688 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b16de833-9dc8-4e72-92b8-9374c7ab50bf-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.329707 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b16de833-9dc8-4e72-92b8-9374c7ab50bf-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.329734 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b16de833-9dc8-4e72-92b8-9374c7ab50bf-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 
05:30:20.329816 4628 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.330537 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b16de833-9dc8-4e72-92b8-9374c7ab50bf-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.331144 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b16de833-9dc8-4e72-92b8-9374c7ab50bf-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.331225 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b16de833-9dc8-4e72-92b8-9374c7ab50bf-config\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.338784 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b16de833-9dc8-4e72-92b8-9374c7ab50bf-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.342061 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.345945 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.347475 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b16de833-9dc8-4e72-92b8-9374c7ab50bf-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.350052 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b16de833-9dc8-4e72-92b8-9374c7ab50bf-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.351133 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.351318 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.351435 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-xpcph" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.351548 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.355979 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.367598 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkfqz\" (UniqueName: \"kubernetes.io/projected/b16de833-9dc8-4e72-92b8-9374c7ab50bf-kube-api-access-jkfqz\") pod \"ovsdbserver-nb-0\" (UID: \"b16de833-9dc8-4e72-92b8-9374c7ab50bf\") " pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.397651 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.431217 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.431294 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b45a8a8a-00cb-482a-bfc5-149e693949c1-config\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.431324 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b45a8a8a-00cb-482a-bfc5-149e693949c1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.431409 4628 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b45a8a8a-00cb-482a-bfc5-149e693949c1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.431452 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dt8m\" (UniqueName: \"kubernetes.io/projected/b45a8a8a-00cb-482a-bfc5-149e693949c1-kube-api-access-2dt8m\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.431573 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b45a8a8a-00cb-482a-bfc5-149e693949c1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.431788 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b45a8a8a-00cb-482a-bfc5-149e693949c1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.431946 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b45a8a8a-00cb-482a-bfc5-149e693949c1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.478603 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.533732 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b45a8a8a-00cb-482a-bfc5-149e693949c1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.533815 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b45a8a8a-00cb-482a-bfc5-149e693949c1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.533878 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.533912 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b45a8a8a-00cb-482a-bfc5-149e693949c1-config\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.533937 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b45a8a8a-00cb-482a-bfc5-149e693949c1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.533988 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b45a8a8a-00cb-482a-bfc5-149e693949c1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.534017 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dt8m\" (UniqueName: \"kubernetes.io/projected/b45a8a8a-00cb-482a-bfc5-149e693949c1-kube-api-access-2dt8m\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.534052 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b45a8a8a-00cb-482a-bfc5-149e693949c1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.534557 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b45a8a8a-00cb-482a-bfc5-149e693949c1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.534630 4628 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") 
pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.535372 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b45a8a8a-00cb-482a-bfc5-149e693949c1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.535442 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b45a8a8a-00cb-482a-bfc5-149e693949c1-config\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.538909 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b45a8a8a-00cb-482a-bfc5-149e693949c1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.543667 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b45a8a8a-00cb-482a-bfc5-149e693949c1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.556759 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dt8m\" (UniqueName: \"kubernetes.io/projected/b45a8a8a-00cb-482a-bfc5-149e693949c1-kube-api-access-2dt8m\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.559968 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b45a8a8a-00cb-482a-bfc5-149e693949c1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.573685 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b45a8a8a-00cb-482a-bfc5-149e693949c1\") " pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:20 crc kubenswrapper[4628]: I1211 05:30:20.709048 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:21 crc kubenswrapper[4628]: I1211 05:30:21.932168 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 11 05:30:25 crc kubenswrapper[4628]: I1211 05:30:25.736933 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a14ea6c9-f372-463b-8485-a3411412cbe9","Type":"ContainerStarted","Data":"a628b724285a57ca35a8fe8b6fc983eef3fc45707e5cc9aff921f389f1ee7982"} Dec 11 05:30:31 crc kubenswrapper[4628]: I1211 05:30:31.426413 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:30:31 crc kubenswrapper[4628]: I1211 05:30:31.426725 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:30:31 crc kubenswrapper[4628]: E1211 05:30:31.678378 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 11 05:30:31 crc kubenswrapper[4628]: E1211 05:30:31.678820 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p8mrt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(5279d32c-7625-460c-881b-243e69077070): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:30:31 crc kubenswrapper[4628]: E1211 05:30:31.680013 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="5279d32c-7625-460c-881b-243e69077070" Dec 11 05:30:31 crc kubenswrapper[4628]: I1211 05:30:31.986537 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 11 05:30:34 crc kubenswrapper[4628]: E1211 05:30:34.302284 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 11 05:30:34 crc kubenswrapper[4628]: E1211 05:30:34.302601 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5bq6n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-4ms4n_openstack(72259b41-2e95-4531-a2ff-2939e437253c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:30:34 crc kubenswrapper[4628]: E1211 05:30:34.303808 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" podUID="72259b41-2e95-4531-a2ff-2939e437253c" Dec 11 05:30:34 crc kubenswrapper[4628]: E1211 05:30:34.833058 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" podUID="72259b41-2e95-4531-a2ff-2939e437253c" Dec 11 05:30:36 crc kubenswrapper[4628]: E1211 05:30:36.527779 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 11 05:30:36 crc kubenswrapper[4628]: E1211 05:30:36.528449 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2vxkx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-fgwt6_openstack(67c605f5-a2d1-4a12-9244-107134d07f1c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:30:36 crc kubenswrapper[4628]: E1211 05:30:36.533430 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-fgwt6" podUID="67c605f5-a2d1-4a12-9244-107134d07f1c" Dec 11 05:30:36 crc kubenswrapper[4628]: E1211 05:30:36.586049 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 11 05:30:36 crc kubenswrapper[4628]: E1211 05:30:36.586192 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-slx8j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-stwfj_openstack(b14da2d6-24da-40bb-8e6a-ec3dc566ad8d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:30:36 crc kubenswrapper[4628]: E1211 05:30:36.587431 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-stwfj" podUID="b14da2d6-24da-40bb-8e6a-ec3dc566ad8d" Dec 11 05:30:36 crc kubenswrapper[4628]: E1211 05:30:36.602538 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 11 05:30:36 crc kubenswrapper[4628]: E1211 05:30:36.602684 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6hqrl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-bxzmm_openstack(534bc26b-a52d-40e7-8ac2-4aa407d070a9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:30:36 crc kubenswrapper[4628]: E1211 05:30:36.604256 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-bxzmm" podUID="534bc26b-a52d-40e7-8ac2-4aa407d070a9" Dec 11 05:30:36 crc kubenswrapper[4628]: I1211 05:30:36.804389 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qz7fr"] Dec 11 05:30:36 crc kubenswrapper[4628]: I1211 05:30:36.861940 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f5879bd1-c58f-4c7a-8158-8be2bd632bf8","Type":"ContainerStarted","Data":"2689f67e7fe94d3df2ed833cce6730a054c0388626f41022b10d4da4f3d158d8"} Dec 11 05:30:36 crc kubenswrapper[4628]: E1211 05:30:36.862607 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-bxzmm" podUID="534bc26b-a52d-40e7-8ac2-4aa407d070a9" Dec 11 05:30:37 crc kubenswrapper[4628]: I1211 05:30:37.165867 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 11 05:30:37 crc kubenswrapper[4628]: I1211 05:30:37.292969 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 11 05:30:37 crc kubenswrapper[4628]: I1211 05:30:37.345034 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-sb-0"] Dec 11 05:30:37 crc kubenswrapper[4628]: I1211 05:30:37.440177 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gncbg"] Dec 11 05:30:37 crc kubenswrapper[4628]: W1211 05:30:37.855903 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb16de833_9dc8_4e72_92b8_9374c7ab50bf.slice/crio-b90e2194067efacc9ee7a132e7c7e1a4f5e1cb1b243e85bf571f9d1cdef5a561 WatchSource:0}: Error finding container b90e2194067efacc9ee7a132e7c7e1a4f5e1cb1b243e85bf571f9d1cdef5a561: Status 404 returned error can't find the container with id b90e2194067efacc9ee7a132e7c7e1a4f5e1cb1b243e85bf571f9d1cdef5a561 Dec 11 05:30:37 crc kubenswrapper[4628]: W1211 05:30:37.858178 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb45a8a8a_00cb_482a_bfc5_149e693949c1.slice/crio-c16af7474e2e1fa0c0b7f5dbb1dab791c77b57d3a9db919114b8a540ead8b63f WatchSource:0}: Error finding container c16af7474e2e1fa0c0b7f5dbb1dab791c77b57d3a9db919114b8a540ead8b63f: Status 404 returned error can't find the container with id c16af7474e2e1fa0c0b7f5dbb1dab791c77b57d3a9db919114b8a540ead8b63f Dec 11 05:30:37 crc kubenswrapper[4628]: I1211 05:30:37.871224 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-fgwt6" event={"ID":"67c605f5-a2d1-4a12-9244-107134d07f1c","Type":"ContainerDied","Data":"edd2b4ad04e52e6e2397b4070144d7e8dbb1e1953ec1d28cb75557d1296b6a9d"} Dec 11 05:30:37 crc kubenswrapper[4628]: I1211 05:30:37.871270 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edd2b4ad04e52e6e2397b4070144d7e8dbb1e1953ec1d28cb75557d1296b6a9d" Dec 11 05:30:37 crc kubenswrapper[4628]: I1211 05:30:37.879549 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b45a8a8a-00cb-482a-bfc5-149e693949c1","Type":"ContainerStarted","Data":"c16af7474e2e1fa0c0b7f5dbb1dab791c77b57d3a9db919114b8a540ead8b63f"} Dec 11 05:30:37 crc kubenswrapper[4628]: I1211 05:30:37.882253 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b16de833-9dc8-4e72-92b8-9374c7ab50bf","Type":"ContainerStarted","Data":"b90e2194067efacc9ee7a132e7c7e1a4f5e1cb1b243e85bf571f9d1cdef5a561"} Dec 11 05:30:37 crc kubenswrapper[4628]: I1211 05:30:37.934891 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fgwt6" Dec 11 05:30:37 crc kubenswrapper[4628]: I1211 05:30:37.941872 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-stwfj" Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.007209 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67c605f5-a2d1-4a12-9244-107134d07f1c-config\") pod \"67c605f5-a2d1-4a12-9244-107134d07f1c\" (UID: \"67c605f5-a2d1-4a12-9244-107134d07f1c\") " Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.007288 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b14da2d6-24da-40bb-8e6a-ec3dc566ad8d-config\") pod \"b14da2d6-24da-40bb-8e6a-ec3dc566ad8d\" (UID: \"b14da2d6-24da-40bb-8e6a-ec3dc566ad8d\") " Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.007383 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slx8j\" (UniqueName: \"kubernetes.io/projected/b14da2d6-24da-40bb-8e6a-ec3dc566ad8d-kube-api-access-slx8j\") pod \"b14da2d6-24da-40bb-8e6a-ec3dc566ad8d\" (UID: \"b14da2d6-24da-40bb-8e6a-ec3dc566ad8d\") " Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.007441 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67c605f5-a2d1-4a12-9244-107134d07f1c-dns-svc\") pod \"67c605f5-a2d1-4a12-9244-107134d07f1c\" (UID: \"67c605f5-a2d1-4a12-9244-107134d07f1c\") " Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.007459 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vxkx\" (UniqueName: \"kubernetes.io/projected/67c605f5-a2d1-4a12-9244-107134d07f1c-kube-api-access-2vxkx\") pod \"67c605f5-a2d1-4a12-9244-107134d07f1c\" (UID: \"67c605f5-a2d1-4a12-9244-107134d07f1c\") " Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.008194 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b14da2d6-24da-40bb-8e6a-ec3dc566ad8d-config" (OuterVolumeSpecName: "config") pod "b14da2d6-24da-40bb-8e6a-ec3dc566ad8d" (UID: "b14da2d6-24da-40bb-8e6a-ec3dc566ad8d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.008639 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67c605f5-a2d1-4a12-9244-107134d07f1c-config" (OuterVolumeSpecName: "config") pod "67c605f5-a2d1-4a12-9244-107134d07f1c" (UID: "67c605f5-a2d1-4a12-9244-107134d07f1c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.009267 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67c605f5-a2d1-4a12-9244-107134d07f1c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "67c605f5-a2d1-4a12-9244-107134d07f1c" (UID: "67c605f5-a2d1-4a12-9244-107134d07f1c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.009803 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67c605f5-a2d1-4a12-9244-107134d07f1c-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.009821 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b14da2d6-24da-40bb-8e6a-ec3dc566ad8d-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.009831 4628 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67c605f5-a2d1-4a12-9244-107134d07f1c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.031833 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b14da2d6-24da-40bb-8e6a-ec3dc566ad8d-kube-api-access-slx8j" (OuterVolumeSpecName: "kube-api-access-slx8j") pod "b14da2d6-24da-40bb-8e6a-ec3dc566ad8d" (UID: "b14da2d6-24da-40bb-8e6a-ec3dc566ad8d"). InnerVolumeSpecName "kube-api-access-slx8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.034869 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67c605f5-a2d1-4a12-9244-107134d07f1c-kube-api-access-2vxkx" (OuterVolumeSpecName: "kube-api-access-2vxkx") pod "67c605f5-a2d1-4a12-9244-107134d07f1c" (UID: "67c605f5-a2d1-4a12-9244-107134d07f1c"). InnerVolumeSpecName "kube-api-access-2vxkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.111481 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slx8j\" (UniqueName: \"kubernetes.io/projected/b14da2d6-24da-40bb-8e6a-ec3dc566ad8d-kube-api-access-slx8j\") on node \"crc\" DevicePath \"\"" Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.111513 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vxkx\" (UniqueName: \"kubernetes.io/projected/67c605f5-a2d1-4a12-9244-107134d07f1c-kube-api-access-2vxkx\") on node \"crc\" DevicePath \"\"" Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.900342 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8","Type":"ContainerStarted","Data":"9dfd751e3ee0674ed5c12bff59eea513dd6a63e500db97130ea4da7e1dd3982c"} Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.901729 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a14ea6c9-f372-463b-8485-a3411412cbe9","Type":"ContainerStarted","Data":"eab0be01219e6f527a89fe8618304be78713c1dc4494066bbeba48925ad9835f"} Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.902579 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.904395 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-stwfj" event={"ID":"b14da2d6-24da-40bb-8e6a-ec3dc566ad8d","Type":"ContainerDied","Data":"5d2b867cc74696bacbcfa29a7401e33bcf7bbbc2522ed63873c7ca93ac5bb9b5"} Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.904460 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-stwfj" Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.915200 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qz7fr" event={"ID":"d0885fe4-936a-4a13-b4e5-4aeee593c242","Type":"ContainerStarted","Data":"f5464800694b1c651c41fba81dc1428b0b9c9749c4dcc889e7de6da046326d70"} Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.916703 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e4498a18-7449-45b3-9061-d3ffbfa4be5b","Type":"ContainerStarted","Data":"57ef236a1a90af234091a626032a53aaa180d72235d926ba9f4eae52494e6446"} Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.941856 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-fgwt6" Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.942685 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gncbg" event={"ID":"7c72a5ae-bbee-41cd-bb23-b9feb77f594d","Type":"ContainerStarted","Data":"78876bb39816e56fd3b80b7dc6edb4469a3d0f5dc6540f098a54ef43a31252ff"} Dec 11 05:30:38 crc kubenswrapper[4628]: I1211 05:30:38.949927 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.128946072 podStartE2EDuration="27.949910743s" podCreationTimestamp="2025-12-11 05:30:11 +0000 UTC" firstStartedPulling="2025-12-11 05:30:25.19412377 +0000 UTC m=+927.611470468" lastFinishedPulling="2025-12-11 05:30:38.015088441 +0000 UTC m=+940.432435139" observedRunningTime="2025-12-11 05:30:38.92565047 +0000 UTC m=+941.342997168" watchObservedRunningTime="2025-12-11 05:30:38.949910743 +0000 UTC m=+941.367257441" Dec 11 05:30:39 crc kubenswrapper[4628]: I1211 05:30:39.071237 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-stwfj"] Dec 11 05:30:39 crc kubenswrapper[4628]: I1211 05:30:39.084109 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-stwfj"] Dec 11 05:30:39 crc kubenswrapper[4628]: I1211 05:30:39.107916 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fgwt6"] Dec 11 05:30:39 crc kubenswrapper[4628]: I1211 05:30:39.114987 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-fgwt6"] Dec 11 05:30:39 crc kubenswrapper[4628]: I1211 05:30:39.908417 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67c605f5-a2d1-4a12-9244-107134d07f1c" path="/var/lib/kubelet/pods/67c605f5-a2d1-4a12-9244-107134d07f1c/volumes" Dec 11 05:30:39 crc kubenswrapper[4628]: I1211 05:30:39.908772 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b14da2d6-24da-40bb-8e6a-ec3dc566ad8d" path="/var/lib/kubelet/pods/b14da2d6-24da-40bb-8e6a-ec3dc566ad8d/volumes" Dec 11 05:30:39 crc kubenswrapper[4628]: I1211 05:30:39.949501 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f5879bd1-c58f-4c7a-8158-8be2bd632bf8","Type":"ContainerStarted","Data":"b9ae375f78cc566f1134ab9f14331a0e6d31c08b0876678472a2cf031691f52e"} Dec 11 05:30:39 crc kubenswrapper[4628]: I1211 05:30:39.952745 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"a07218df-1f25-47c4-89dc-2c7ce7f406ac","Type":"ContainerStarted","Data":"fbb2f6ff2b4cf940c6af7bddcc8de8efd9bce8d4d8220bccb23d4c9966e7b818"} Dec 11 05:30:39 crc kubenswrapper[4628]: I1211 05:30:39.955742 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5279d32c-7625-460c-881b-243e69077070","Type":"ContainerStarted","Data":"86cdb42df246a58a1bcb275c5570adfa9b9a943b1d21a98085ada9bb6063ed40"} Dec 11 05:30:42 crc kubenswrapper[4628]: E1211 05:30:42.418530 4628 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4498a18_7449_45b3_9061_d3ffbfa4be5b.slice/crio-57ef236a1a90af234091a626032a53aaa180d72235d926ba9f4eae52494e6446.scope\": RecentStats: unable to find data in memory cache]" Dec 11 05:30:42 crc kubenswrapper[4628]: I1211 05:30:42.986327 4628 generic.go:334] "Generic (PLEG): container finished" podID="e4498a18-7449-45b3-9061-d3ffbfa4be5b" containerID="57ef236a1a90af234091a626032a53aaa180d72235d926ba9f4eae52494e6446" exitCode=0 Dec 11 05:30:42 crc kubenswrapper[4628]: I1211 05:30:42.986437 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e4498a18-7449-45b3-9061-d3ffbfa4be5b","Type":"ContainerDied","Data":"57ef236a1a90af234091a626032a53aaa180d72235d926ba9f4eae52494e6446"} Dec 11 05:30:42 crc kubenswrapper[4628]: I1211 05:30:42.996274 4628 generic.go:334] "Generic (PLEG): container finished" podID="f5879bd1-c58f-4c7a-8158-8be2bd632bf8" containerID="b9ae375f78cc566f1134ab9f14331a0e6d31c08b0876678472a2cf031691f52e" exitCode=0 Dec 11 05:30:42 crc kubenswrapper[4628]: I1211 05:30:42.996327 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f5879bd1-c58f-4c7a-8158-8be2bd632bf8","Type":"ContainerDied","Data":"b9ae375f78cc566f1134ab9f14331a0e6d31c08b0876678472a2cf031691f52e"} Dec 11 05:30:44 crc kubenswrapper[4628]: I1211 05:30:44.003343 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qz7fr" event={"ID":"d0885fe4-936a-4a13-b4e5-4aeee593c242","Type":"ContainerStarted","Data":"9e813a707165a1c3e4e9b6722d1521136d47a039afa0d187ea476ed7a35ea6e4"} Dec 11 05:30:44 crc kubenswrapper[4628]: I1211 05:30:44.003956 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-qz7fr" Dec 11 05:30:44 crc kubenswrapper[4628]: I1211 05:30:44.005554 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b16de833-9dc8-4e72-92b8-9374c7ab50bf","Type":"ContainerStarted","Data":"c270fc0a513bad80753e455ef3c6b17159237e2f44a10e3034bcdc7a6c0ea81f"} Dec 11 05:30:44 crc kubenswrapper[4628]: I1211 05:30:44.007483 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gncbg" event={"ID":"7c72a5ae-bbee-41cd-bb23-b9feb77f594d","Type":"ContainerStarted","Data":"e7de4e004dd22b150a6284525ed2c7396a5b1c4d2a1d2ccb16ce8d77a1d90d47"} Dec 11 05:30:44 crc kubenswrapper[4628]: I1211 05:30:44.009946 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e4498a18-7449-45b3-9061-d3ffbfa4be5b","Type":"ContainerStarted","Data":"bf611a31626041d9272ba096aa1562d1d06ecf242f6248d4ca39e337886761c1"} Dec 11 05:30:44 crc kubenswrapper[4628]: I1211 05:30:44.011953 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"f5879bd1-c58f-4c7a-8158-8be2bd632bf8","Type":"ContainerStarted","Data":"b70dec030188b654192eea607c760be68c0b7fc6b97999e7fdb6fd506e63e5ee"} Dec 11 05:30:44 crc kubenswrapper[4628]: I1211 05:30:44.013533 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8","Type":"ContainerStarted","Data":"56e852587b89b3c4648d688fad2ba0c65c8509bc4fc5d7bf4b074171112fb921"} Dec 11 05:30:44 crc kubenswrapper[4628]: I1211 05:30:44.013989 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 11 05:30:44 crc kubenswrapper[4628]: I1211 05:30:44.016146 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b45a8a8a-00cb-482a-bfc5-149e693949c1","Type":"ContainerStarted","Data":"ea9d24da001b7746be1238153ae3e03558dd21740515c8e8c143264e316c2975"} Dec 11 05:30:44 crc kubenswrapper[4628]: I1211 05:30:44.051940 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-qz7fr" podStartSLOduration=21.41558105 podStartE2EDuration="27.051920108s" podCreationTimestamp="2025-12-11 05:30:17 +0000 UTC" firstStartedPulling="2025-12-11 05:30:37.912727429 +0000 UTC m=+940.330074137" lastFinishedPulling="2025-12-11 05:30:43.549066497 +0000 UTC m=+945.966413195" observedRunningTime="2025-12-11 05:30:44.030146561 +0000 UTC m=+946.447493259" watchObservedRunningTime="2025-12-11 05:30:44.051920108 +0000 UTC m=+946.469266816" Dec 11 05:30:44 crc kubenswrapper[4628]: I1211 05:30:44.054239 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=14.716874072 podStartE2EDuration="36.054230159s" podCreationTimestamp="2025-12-11 05:30:08 +0000 UTC" firstStartedPulling="2025-12-11 05:30:16.632339201 +0000 UTC m=+919.049685899" lastFinishedPulling="2025-12-11 05:30:37.969695288 +0000 UTC m=+940.387041986" observedRunningTime="2025-12-11 05:30:44.047628924 +0000 UTC m=+946.464975642" watchObservedRunningTime="2025-12-11 05:30:44.054230159 +0000 UTC m=+946.471576857" Dec 11 05:30:44 crc kubenswrapper[4628]: I1211 05:30:44.079799 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=32.424446929 podStartE2EDuration="34.079783896s" podCreationTimestamp="2025-12-11 05:30:10 +0000 UTC" firstStartedPulling="2025-12-11 05:30:36.392956203 +0000 UTC m=+938.810302901" lastFinishedPulling="2025-12-11 05:30:38.04829317 +0000 UTC m=+940.465639868" observedRunningTime="2025-12-11 05:30:44.078097581 +0000 UTC m=+946.495444279" watchObservedRunningTime="2025-12-11 05:30:44.079783896 +0000 UTC m=+946.497130594" Dec 11 05:30:44 crc kubenswrapper[4628]: I1211 05:30:44.082361 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=25.426694624 podStartE2EDuration="31.082355534s" podCreationTimestamp="2025-12-11 05:30:13 +0000 UTC" firstStartedPulling="2025-12-11 05:30:37.914445335 +0000 UTC m=+940.331792043" lastFinishedPulling="2025-12-11 05:30:43.570106255 +0000 UTC m=+945.987452953" observedRunningTime="2025-12-11 05:30:44.062373035 +0000 UTC m=+946.479719733" watchObservedRunningTime="2025-12-11 05:30:44.082355534 +0000 UTC m=+946.499702232" Dec 11 05:30:45 crc kubenswrapper[4628]: I1211 05:30:45.024465 4628 generic.go:334] "Generic (PLEG): container finished" 
podID="7c72a5ae-bbee-41cd-bb23-b9feb77f594d" containerID="e7de4e004dd22b150a6284525ed2c7396a5b1c4d2a1d2ccb16ce8d77a1d90d47" exitCode=0 Dec 11 05:30:45 crc kubenswrapper[4628]: I1211 05:30:45.024547 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gncbg" event={"ID":"7c72a5ae-bbee-41cd-bb23-b9feb77f594d","Type":"ContainerDied","Data":"e7de4e004dd22b150a6284525ed2c7396a5b1c4d2a1d2ccb16ce8d77a1d90d47"} Dec 11 05:30:46 crc kubenswrapper[4628]: I1211 05:30:46.037002 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gncbg" event={"ID":"7c72a5ae-bbee-41cd-bb23-b9feb77f594d","Type":"ContainerStarted","Data":"99c14f1dff2f1dde8b3b08d9e189cfa65d1b2635ea99afeb368c9077d541b7a6"} Dec 11 05:30:46 crc kubenswrapper[4628]: I1211 05:30:46.037518 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:46 crc kubenswrapper[4628]: I1211 05:30:46.037532 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gncbg" event={"ID":"7c72a5ae-bbee-41cd-bb23-b9feb77f594d","Type":"ContainerStarted","Data":"7169aa3eea692060a9f7015af75f6a5f4f27e8f25a712de19f3fa0a042bfc9cb"} Dec 11 05:30:46 crc kubenswrapper[4628]: I1211 05:30:46.079436 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-gncbg" podStartSLOduration=23.472514354 podStartE2EDuration="29.079416404s" podCreationTimestamp="2025-12-11 05:30:17 +0000 UTC" firstStartedPulling="2025-12-11 05:30:37.867012878 +0000 UTC m=+940.284359586" lastFinishedPulling="2025-12-11 05:30:43.473914938 +0000 UTC m=+945.891261636" observedRunningTime="2025-12-11 05:30:46.076682541 +0000 UTC m=+948.494029269" watchObservedRunningTime="2025-12-11 05:30:46.079416404 +0000 UTC m=+948.496763122" Dec 11 05:30:46 crc kubenswrapper[4628]: I1211 05:30:46.818232 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 11 05:30:47 crc kubenswrapper[4628]: I1211 05:30:47.042983 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:30:47 crc kubenswrapper[4628]: E1211 05:30:47.571321 4628 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.18:50876->38.102.83.18:36143: write tcp 38.102.83.18:50876->38.102.83.18:36143: write: broken pipe Dec 11 05:30:49 crc kubenswrapper[4628]: I1211 05:30:49.056688 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b45a8a8a-00cb-482a-bfc5-149e693949c1","Type":"ContainerStarted","Data":"9f309cb661240c35166a018c09106b65b23fc88e5631bae1fd617838a5a6f0a6"} Dec 11 05:30:49 crc kubenswrapper[4628]: I1211 05:30:49.058102 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b16de833-9dc8-4e72-92b8-9374c7ab50bf","Type":"ContainerStarted","Data":"9d374d8bd0c411e8329b64bf757c05e9b300b1796d670a292a6435d1f55152aa"} Dec 11 05:30:49 crc kubenswrapper[4628]: I1211 05:30:49.059513 4628 generic.go:334] "Generic (PLEG): container finished" podID="72259b41-2e95-4531-a2ff-2939e437253c" containerID="5f8996d8a55741f8174cbaa5144807a280e7d66ba8e8511457a5325a007488ec" exitCode=0 Dec 11 05:30:49 crc kubenswrapper[4628]: I1211 05:30:49.059539 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" 
event={"ID":"72259b41-2e95-4531-a2ff-2939e437253c","Type":"ContainerDied","Data":"5f8996d8a55741f8174cbaa5144807a280e7d66ba8e8511457a5325a007488ec"} Dec 11 05:30:49 crc kubenswrapper[4628]: I1211 05:30:49.096680 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=19.828176968 podStartE2EDuration="30.096662166s" podCreationTimestamp="2025-12-11 05:30:19 +0000 UTC" firstStartedPulling="2025-12-11 05:30:37.862820267 +0000 UTC m=+940.280166975" lastFinishedPulling="2025-12-11 05:30:48.131305475 +0000 UTC m=+950.548652173" observedRunningTime="2025-12-11 05:30:49.092728661 +0000 UTC m=+951.510075369" watchObservedRunningTime="2025-12-11 05:30:49.096662166 +0000 UTC m=+951.514008864" Dec 11 05:30:49 crc kubenswrapper[4628]: I1211 05:30:49.135481 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=19.948333741 podStartE2EDuration="30.135462904s" podCreationTimestamp="2025-12-11 05:30:19 +0000 UTC" firstStartedPulling="2025-12-11 05:30:37.864323987 +0000 UTC m=+940.281670685" lastFinishedPulling="2025-12-11 05:30:48.05145315 +0000 UTC m=+950.468799848" observedRunningTime="2025-12-11 05:30:49.133384079 +0000 UTC m=+951.550730777" watchObservedRunningTime="2025-12-11 05:30:49.135462904 +0000 UTC m=+951.552809602" Dec 11 05:30:49 crc kubenswrapper[4628]: I1211 05:30:49.987814 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 11 05:30:49 crc kubenswrapper[4628]: I1211 05:30:49.988212 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 11 05:30:50 crc kubenswrapper[4628]: I1211 05:30:50.069672 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" event={"ID":"72259b41-2e95-4531-a2ff-2939e437253c","Type":"ContainerStarted","Data":"50ef0e9377a23d7f2adb8af12c0fd32e76aa5ede5aa6c12737ce551c61461f01"} Dec 11 05:30:50 crc kubenswrapper[4628]: I1211 05:30:50.071302 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" Dec 11 05:30:50 crc kubenswrapper[4628]: I1211 05:30:50.095550 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" podStartSLOduration=2.547283557 podStartE2EDuration="43.095523234s" podCreationTimestamp="2025-12-11 05:30:07 +0000 UTC" firstStartedPulling="2025-12-11 05:30:07.976686495 +0000 UTC m=+910.394033193" lastFinishedPulling="2025-12-11 05:30:48.524926162 +0000 UTC m=+950.942272870" observedRunningTime="2025-12-11 05:30:50.090416058 +0000 UTC m=+952.507762756" watchObservedRunningTime="2025-12-11 05:30:50.095523234 +0000 UTC m=+952.512869942" Dec 11 05:30:50 crc kubenswrapper[4628]: I1211 05:30:50.113076 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 11 05:30:50 crc kubenswrapper[4628]: I1211 05:30:50.218289 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 11 05:30:50 crc kubenswrapper[4628]: I1211 05:30:50.478957 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:50 crc kubenswrapper[4628]: I1211 05:30:50.479021 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:50 crc kubenswrapper[4628]: 
I1211 05:30:50.522536 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:50 crc kubenswrapper[4628]: I1211 05:30:50.710118 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:50 crc kubenswrapper[4628]: I1211 05:30:50.710184 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:50 crc kubenswrapper[4628]: I1211 05:30:50.752601 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.129260 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.131802 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.351967 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bxzmm"] Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.395326 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xv5gk"] Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.400229 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.402827 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.423451 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xv5gk"] Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.453980 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ac9c-account-create-update-9jpmx"] Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.455213 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ac9c-account-create-update-9jpmx" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.457146 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.470860 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ac9c-account-create-update-9jpmx"] Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.510632 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-xv5gk\" (UID: \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\") " pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.510708 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-xv5gk\" (UID: \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\") " pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.510727 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxzhg\" (UniqueName: \"kubernetes.io/projected/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-kube-api-access-kxzhg\") pod \"dnsmasq-dns-7f896c8c65-xv5gk\" (UID: \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\") " pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.510750 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-config\") pod \"dnsmasq-dns-7f896c8c65-xv5gk\" (UID: \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\") " pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.530671 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-k98qz"] Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.532954 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-k98qz" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.546817 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-k98qz"] Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.606980 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-szj8g"] Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.607959 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.617802 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvckd\" (UniqueName: \"kubernetes.io/projected/a41b81d7-fa6c-4daa-b06d-df1105c0e566-kube-api-access-nvckd\") pod \"keystone-ac9c-account-create-update-9jpmx\" (UID: \"a41b81d7-fa6c-4daa-b06d-df1105c0e566\") " pod="openstack/keystone-ac9c-account-create-update-9jpmx" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.617865 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-xv5gk\" (UID: \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\") " pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.617888 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxzhg\" (UniqueName: \"kubernetes.io/projected/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-kube-api-access-kxzhg\") pod \"dnsmasq-dns-7f896c8c65-xv5gk\" (UID: \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\") " pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.617905 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-config\") pod \"dnsmasq-dns-7f896c8c65-xv5gk\" (UID: \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\") " pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.617968 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a41b81d7-fa6c-4daa-b06d-df1105c0e566-operator-scripts\") pod \"keystone-ac9c-account-create-update-9jpmx\" (UID: \"a41b81d7-fa6c-4daa-b06d-df1105c0e566\") " pod="openstack/keystone-ac9c-account-create-update-9jpmx" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.618026 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm22m\" (UniqueName: \"kubernetes.io/projected/6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f-kube-api-access-xm22m\") pod \"keystone-db-create-k98qz\" (UID: \"6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f\") " pod="openstack/keystone-db-create-k98qz" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.618066 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-xv5gk\" (UID: \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\") " pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.618081 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f-operator-scripts\") pod \"keystone-db-create-k98qz\" (UID: \"6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f\") " pod="openstack/keystone-db-create-k98qz" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.618877 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-dns-svc\") pod 
\"dnsmasq-dns-7f896c8c65-xv5gk\" (UID: \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\") " pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.619620 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-config\") pod \"dnsmasq-dns-7f896c8c65-xv5gk\" (UID: \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\") " pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.620987 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-xv5gk\" (UID: \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\") " pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.621521 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.639079 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-szj8g"] Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.657234 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxzhg\" (UniqueName: \"kubernetes.io/projected/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-kube-api-access-kxzhg\") pod \"dnsmasq-dns-7f896c8c65-xv5gk\" (UID: \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\") " pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.673831 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.676038 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.681648 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.681805 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.681930 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.682026 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-wh8rs" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.711023 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.720897 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.721921 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a41b81d7-fa6c-4daa-b06d-df1105c0e566-operator-scripts\") pod \"keystone-ac9c-account-create-update-9jpmx\" (UID: \"a41b81d7-fa6c-4daa-b06d-df1105c0e566\") " pod="openstack/keystone-ac9c-account-create-update-9jpmx" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.722028 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4361dca-0563-4576-a32a-2f03e4f399a0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-szj8g\" (UID: \"c4361dca-0563-4576-a32a-2f03e4f399a0\") " pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.722073 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c4361dca-0563-4576-a32a-2f03e4f399a0-ovs-rundir\") pod \"ovn-controller-metrics-szj8g\" (UID: \"c4361dca-0563-4576-a32a-2f03e4f399a0\") " pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.722109 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm22m\" (UniqueName: \"kubernetes.io/projected/6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f-kube-api-access-xm22m\") pod \"keystone-db-create-k98qz\" (UID: \"6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f\") " pod="openstack/keystone-db-create-k98qz" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.722162 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f-operator-scripts\") pod \"keystone-db-create-k98qz\" (UID: \"6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f\") " pod="openstack/keystone-db-create-k98qz" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.722191 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c4361dca-0563-4576-a32a-2f03e4f399a0-ovn-rundir\") pod \"ovn-controller-metrics-szj8g\" (UID: \"c4361dca-0563-4576-a32a-2f03e4f399a0\") " pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.722209 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4361dca-0563-4576-a32a-2f03e4f399a0-config\") pod \"ovn-controller-metrics-szj8g\" (UID: \"c4361dca-0563-4576-a32a-2f03e4f399a0\") " pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.722240 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvckd\" (UniqueName: \"kubernetes.io/projected/a41b81d7-fa6c-4daa-b06d-df1105c0e566-kube-api-access-nvckd\") pod \"keystone-ac9c-account-create-update-9jpmx\" (UID: \"a41b81d7-fa6c-4daa-b06d-df1105c0e566\") " pod="openstack/keystone-ac9c-account-create-update-9jpmx" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.722277 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c4361dca-0563-4576-a32a-2f03e4f399a0-combined-ca-bundle\") pod \"ovn-controller-metrics-szj8g\" (UID: \"c4361dca-0563-4576-a32a-2f03e4f399a0\") " pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.722335 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbmcv\" (UniqueName: \"kubernetes.io/projected/c4361dca-0563-4576-a32a-2f03e4f399a0-kube-api-access-xbmcv\") pod \"ovn-controller-metrics-szj8g\" (UID: \"c4361dca-0563-4576-a32a-2f03e4f399a0\") " pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.723071 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a41b81d7-fa6c-4daa-b06d-df1105c0e566-operator-scripts\") pod \"keystone-ac9c-account-create-update-9jpmx\" (UID: \"a41b81d7-fa6c-4daa-b06d-df1105c0e566\") " pod="openstack/keystone-ac9c-account-create-update-9jpmx" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.723862 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f-operator-scripts\") pod \"keystone-db-create-k98qz\" (UID: \"6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f\") " pod="openstack/keystone-db-create-k98qz" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.739097 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.739153 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.771614 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvckd\" (UniqueName: \"kubernetes.io/projected/a41b81d7-fa6c-4daa-b06d-df1105c0e566-kube-api-access-nvckd\") pod \"keystone-ac9c-account-create-update-9jpmx\" (UID: \"a41b81d7-fa6c-4daa-b06d-df1105c0e566\") " pod="openstack/keystone-ac9c-account-create-update-9jpmx" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.774112 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ac9c-account-create-update-9jpmx" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.776324 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm22m\" (UniqueName: \"kubernetes.io/projected/6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f-kube-api-access-xm22m\") pod \"keystone-db-create-k98qz\" (UID: \"6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f\") " pod="openstack/keystone-db-create-k98qz" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.790559 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4ms4n"] Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.829611 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c4361dca-0563-4576-a32a-2f03e4f399a0-ovn-rundir\") pod \"ovn-controller-metrics-szj8g\" (UID: \"c4361dca-0563-4576-a32a-2f03e4f399a0\") " pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.829651 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4361dca-0563-4576-a32a-2f03e4f399a0-config\") pod \"ovn-controller-metrics-szj8g\" (UID: \"c4361dca-0563-4576-a32a-2f03e4f399a0\") " pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.829720 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.829743 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4361dca-0563-4576-a32a-2f03e4f399a0-combined-ca-bundle\") pod \"ovn-controller-metrics-szj8g\" (UID: \"c4361dca-0563-4576-a32a-2f03e4f399a0\") " pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.829762 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xbz8\" (UniqueName: \"kubernetes.io/projected/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-kube-api-access-5xbz8\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.829788 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-scripts\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.829832 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-config\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.829874 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbmcv\" (UniqueName: \"kubernetes.io/projected/c4361dca-0563-4576-a32a-2f03e4f399a0-kube-api-access-xbmcv\") pod 
\"ovn-controller-metrics-szj8g\" (UID: \"c4361dca-0563-4576-a32a-2f03e4f399a0\") " pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.829909 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.829933 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4361dca-0563-4576-a32a-2f03e4f399a0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-szj8g\" (UID: \"c4361dca-0563-4576-a32a-2f03e4f399a0\") " pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.829952 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c4361dca-0563-4576-a32a-2f03e4f399a0-ovs-rundir\") pod \"ovn-controller-metrics-szj8g\" (UID: \"c4361dca-0563-4576-a32a-2f03e4f399a0\") " pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.829976 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.830009 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.830315 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c4361dca-0563-4576-a32a-2f03e4f399a0-ovn-rundir\") pod \"ovn-controller-metrics-szj8g\" (UID: \"c4361dca-0563-4576-a32a-2f03e4f399a0\") " pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.830991 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4361dca-0563-4576-a32a-2f03e4f399a0-config\") pod \"ovn-controller-metrics-szj8g\" (UID: \"c4361dca-0563-4576-a32a-2f03e4f399a0\") " pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.835036 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c4361dca-0563-4576-a32a-2f03e4f399a0-ovs-rundir\") pod \"ovn-controller-metrics-szj8g\" (UID: \"c4361dca-0563-4576-a32a-2f03e4f399a0\") " pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.848337 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-k98qz" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.859149 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4361dca-0563-4576-a32a-2f03e4f399a0-combined-ca-bundle\") pod \"ovn-controller-metrics-szj8g\" (UID: \"c4361dca-0563-4576-a32a-2f03e4f399a0\") " pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.871454 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4361dca-0563-4576-a32a-2f03e4f399a0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-szj8g\" (UID: \"c4361dca-0563-4576-a32a-2f03e4f399a0\") " pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.906094 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbmcv\" (UniqueName: \"kubernetes.io/projected/c4361dca-0563-4576-a32a-2f03e4f399a0-kube-api-access-xbmcv\") pod \"ovn-controller-metrics-szj8g\" (UID: \"c4361dca-0563-4576-a32a-2f03e4f399a0\") " pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.940187 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.940255 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xbz8\" (UniqueName: \"kubernetes.io/projected/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-kube-api-access-5xbz8\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.940278 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-scripts\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.940404 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-config\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.940453 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.940520 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.942751 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.944413 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.945406 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-config\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.945980 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-scripts\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.950145 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.960867 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-szj8g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.972585 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-j4d2g"] Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.976403 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.988555 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-j4d2g"] Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.988774 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:30:51 crc kubenswrapper[4628]: I1211 05:30:51.991423 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.000518 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.011519 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xbz8\" (UniqueName: \"kubernetes.io/projected/0ba5be80-485c-4b8b-8e1d-3326db7cc5a0-kube-api-access-5xbz8\") pod \"ovn-northd-0\" (UID: \"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0\") " pod="openstack/ovn-northd-0" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.032358 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-mzcb5"] Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.034175 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mzcb5" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.034795 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.064596 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-config\") pod \"dnsmasq-dns-86db49b7ff-j4d2g\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.064737 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxz29\" (UniqueName: \"kubernetes.io/projected/69514995-7ac6-4dea-b519-317e80b5f9fd-kube-api-access-gxz29\") pod \"dnsmasq-dns-86db49b7ff-j4d2g\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.064774 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-j4d2g\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.064876 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-j4d2g\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.064954 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-j4d2g\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.076641 4628 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-mzcb5"] Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.134888 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2a5a-account-create-update-qmvjk"] Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.136035 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2a5a-account-create-update-qmvjk" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.144613 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" podUID="72259b41-2e95-4531-a2ff-2939e437253c" containerName="dnsmasq-dns" containerID="cri-o://50ef0e9377a23d7f2adb8af12c0fd32e76aa5ede5aa6c12737ce551c61461f01" gracePeriod=10 Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.144943 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.166195 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25fln\" (UniqueName: \"kubernetes.io/projected/1318bf7e-ab46-425f-b121-4423d0623af6-kube-api-access-25fln\") pod \"placement-db-create-mzcb5\" (UID: \"1318bf7e-ab46-425f-b121-4423d0623af6\") " pod="openstack/placement-db-create-mzcb5" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.166245 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxz29\" (UniqueName: \"kubernetes.io/projected/69514995-7ac6-4dea-b519-317e80b5f9fd-kube-api-access-gxz29\") pod \"dnsmasq-dns-86db49b7ff-j4d2g\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.166271 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-j4d2g\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.166296 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m22p9\" (UniqueName: \"kubernetes.io/projected/08120cb0-d26e-4520-a972-829bde3491dc-kube-api-access-m22p9\") pod \"placement-2a5a-account-create-update-qmvjk\" (UID: \"08120cb0-d26e-4520-a972-829bde3491dc\") " pod="openstack/placement-2a5a-account-create-update-qmvjk" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.166335 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-j4d2g\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.166362 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08120cb0-d26e-4520-a972-829bde3491dc-operator-scripts\") pod \"placement-2a5a-account-create-update-qmvjk\" (UID: \"08120cb0-d26e-4520-a972-829bde3491dc\") " pod="openstack/placement-2a5a-account-create-update-qmvjk" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.166391 4628 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1318bf7e-ab46-425f-b121-4423d0623af6-operator-scripts\") pod \"placement-db-create-mzcb5\" (UID: \"1318bf7e-ab46-425f-b121-4423d0623af6\") " pod="openstack/placement-db-create-mzcb5" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.166415 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-j4d2g\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.166466 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-config\") pod \"dnsmasq-dns-86db49b7ff-j4d2g\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.168134 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-j4d2g\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.169276 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-j4d2g\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.169795 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-j4d2g\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.183469 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-config\") pod \"dnsmasq-dns-86db49b7ff-j4d2g\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.204330 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2a5a-account-create-update-qmvjk"] Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.216536 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxz29\" (UniqueName: \"kubernetes.io/projected/69514995-7ac6-4dea-b519-317e80b5f9fd-kube-api-access-gxz29\") pod \"dnsmasq-dns-86db49b7ff-j4d2g\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.270681 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25fln\" (UniqueName: \"kubernetes.io/projected/1318bf7e-ab46-425f-b121-4423d0623af6-kube-api-access-25fln\") pod \"placement-db-create-mzcb5\" (UID: \"1318bf7e-ab46-425f-b121-4423d0623af6\") " pod="openstack/placement-db-create-mzcb5" Dec 
11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.270745 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m22p9\" (UniqueName: \"kubernetes.io/projected/08120cb0-d26e-4520-a972-829bde3491dc-kube-api-access-m22p9\") pod \"placement-2a5a-account-create-update-qmvjk\" (UID: \"08120cb0-d26e-4520-a972-829bde3491dc\") " pod="openstack/placement-2a5a-account-create-update-qmvjk" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.270796 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08120cb0-d26e-4520-a972-829bde3491dc-operator-scripts\") pod \"placement-2a5a-account-create-update-qmvjk\" (UID: \"08120cb0-d26e-4520-a972-829bde3491dc\") " pod="openstack/placement-2a5a-account-create-update-qmvjk" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.270838 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1318bf7e-ab46-425f-b121-4423d0623af6-operator-scripts\") pod \"placement-db-create-mzcb5\" (UID: \"1318bf7e-ab46-425f-b121-4423d0623af6\") " pod="openstack/placement-db-create-mzcb5" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.271864 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08120cb0-d26e-4520-a972-829bde3491dc-operator-scripts\") pod \"placement-2a5a-account-create-update-qmvjk\" (UID: \"08120cb0-d26e-4520-a972-829bde3491dc\") " pod="openstack/placement-2a5a-account-create-update-qmvjk" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.272669 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1318bf7e-ab46-425f-b121-4423d0623af6-operator-scripts\") pod \"placement-db-create-mzcb5\" (UID: \"1318bf7e-ab46-425f-b121-4423d0623af6\") " pod="openstack/placement-db-create-mzcb5" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.293040 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25fln\" (UniqueName: \"kubernetes.io/projected/1318bf7e-ab46-425f-b121-4423d0623af6-kube-api-access-25fln\") pod \"placement-db-create-mzcb5\" (UID: \"1318bf7e-ab46-425f-b121-4423d0623af6\") " pod="openstack/placement-db-create-mzcb5" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.300479 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m22p9\" (UniqueName: \"kubernetes.io/projected/08120cb0-d26e-4520-a972-829bde3491dc-kube-api-access-m22p9\") pod \"placement-2a5a-account-create-update-qmvjk\" (UID: \"08120cb0-d26e-4520-a972-829bde3491dc\") " pod="openstack/placement-2a5a-account-create-update-qmvjk" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.346373 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-bxzmm" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.371959 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hqrl\" (UniqueName: \"kubernetes.io/projected/534bc26b-a52d-40e7-8ac2-4aa407d070a9-kube-api-access-6hqrl\") pod \"534bc26b-a52d-40e7-8ac2-4aa407d070a9\" (UID: \"534bc26b-a52d-40e7-8ac2-4aa407d070a9\") " Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.371997 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/534bc26b-a52d-40e7-8ac2-4aa407d070a9-dns-svc\") pod \"534bc26b-a52d-40e7-8ac2-4aa407d070a9\" (UID: \"534bc26b-a52d-40e7-8ac2-4aa407d070a9\") " Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.372100 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/534bc26b-a52d-40e7-8ac2-4aa407d070a9-config\") pod \"534bc26b-a52d-40e7-8ac2-4aa407d070a9\" (UID: \"534bc26b-a52d-40e7-8ac2-4aa407d070a9\") " Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.372783 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/534bc26b-a52d-40e7-8ac2-4aa407d070a9-config" (OuterVolumeSpecName: "config") pod "534bc26b-a52d-40e7-8ac2-4aa407d070a9" (UID: "534bc26b-a52d-40e7-8ac2-4aa407d070a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.376946 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/534bc26b-a52d-40e7-8ac2-4aa407d070a9-kube-api-access-6hqrl" (OuterVolumeSpecName: "kube-api-access-6hqrl") pod "534bc26b-a52d-40e7-8ac2-4aa407d070a9" (UID: "534bc26b-a52d-40e7-8ac2-4aa407d070a9"). InnerVolumeSpecName "kube-api-access-6hqrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.377605 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/534bc26b-a52d-40e7-8ac2-4aa407d070a9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "534bc26b-a52d-40e7-8ac2-4aa407d070a9" (UID: "534bc26b-a52d-40e7-8ac2-4aa407d070a9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.405725 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.444363 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-mzcb5" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.474533 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hqrl\" (UniqueName: \"kubernetes.io/projected/534bc26b-a52d-40e7-8ac2-4aa407d070a9-kube-api-access-6hqrl\") on node \"crc\" DevicePath \"\"" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.474555 4628 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/534bc26b-a52d-40e7-8ac2-4aa407d070a9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.474564 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/534bc26b-a52d-40e7-8ac2-4aa407d070a9-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.488233 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2a5a-account-create-update-qmvjk" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.590011 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.770030 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.853208 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-szj8g"] Dec 11 05:30:52 crc kubenswrapper[4628]: W1211 05:30:52.928331 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ed312d5_c8ce_42d7_90bd_49e3ef4f5b6f.slice/crio-222c5d1b70f0f09aeb4bc3fa0c04f30219b0c569f84c2824ece1bd0ff755677a WatchSource:0}: Error finding container 222c5d1b70f0f09aeb4bc3fa0c04f30219b0c569f84c2824ece1bd0ff755677a: Status 404 returned error can't find the container with id 222c5d1b70f0f09aeb4bc3fa0c04f30219b0c569f84c2824ece1bd0ff755677a Dec 11 05:30:52 crc kubenswrapper[4628]: I1211 05:30:52.934604 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-k98qz"] Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.018123 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ac9c-account-create-update-9jpmx"] Dec 11 05:30:53 crc kubenswrapper[4628]: W1211 05:30:53.038233 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda41b81d7_fa6c_4daa_b06d_df1105c0e566.slice/crio-ab177afa97e4e404dc97276c0081c5b66386c4e4a87dc994439a808b9e91784c WatchSource:0}: Error finding container ab177afa97e4e404dc97276c0081c5b66386c4e4a87dc994439a808b9e91784c: Status 404 returned error can't find the container with id ab177afa97e4e404dc97276c0081c5b66386c4e4a87dc994439a808b9e91784c Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.039523 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 11 05:30:53 crc kubenswrapper[4628]: W1211 05:30:53.040626 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ba5be80_485c_4b8b_8e1d_3326db7cc5a0.slice/crio-778f6e0300e17cee6058ae43dd2e2e7c6ac367f4fb5e2efd82183c6ee4f49e4c WatchSource:0}: Error finding container 
778f6e0300e17cee6058ae43dd2e2e7c6ac367f4fb5e2efd82183c6ee4f49e4c: Status 404 returned error can't find the container with id 778f6e0300e17cee6058ae43dd2e2e7c6ac367f4fb5e2efd82183c6ee4f49e4c Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.047910 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xv5gk"] Dec 11 05:30:53 crc kubenswrapper[4628]: W1211 05:30:53.050972 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4aa3d172_61ed_43ef_b6c6_cddc2c80565b.slice/crio-cee9faaf7b63d59afdcb5277216cbf9bfa4fe2c955209e00db03cbed68a97d03 WatchSource:0}: Error finding container cee9faaf7b63d59afdcb5277216cbf9bfa4fe2c955209e00db03cbed68a97d03: Status 404 returned error can't find the container with id cee9faaf7b63d59afdcb5277216cbf9bfa4fe2c955209e00db03cbed68a97d03 Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.145501 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-j4d2g"] Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.157344 4628 generic.go:334] "Generic (PLEG): container finished" podID="72259b41-2e95-4531-a2ff-2939e437253c" containerID="50ef0e9377a23d7f2adb8af12c0fd32e76aa5ede5aa6c12737ce551c61461f01" exitCode=0 Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.157426 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" event={"ID":"72259b41-2e95-4531-a2ff-2939e437253c","Type":"ContainerDied","Data":"50ef0e9377a23d7f2adb8af12c0fd32e76aa5ede5aa6c12737ce551c61461f01"} Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.160049 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-szj8g" event={"ID":"c4361dca-0563-4576-a32a-2f03e4f399a0","Type":"ContainerStarted","Data":"dcd0425d7cc91851378468b669ffd6eb592a3dd8deab758561e228471b35cc3a"} Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.161153 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0","Type":"ContainerStarted","Data":"778f6e0300e17cee6058ae43dd2e2e7c6ac367f4fb5e2efd82183c6ee4f49e4c"} Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.162837 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-k98qz" event={"ID":"6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f","Type":"ContainerStarted","Data":"222c5d1b70f0f09aeb4bc3fa0c04f30219b0c569f84c2824ece1bd0ff755677a"} Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.165696 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ac9c-account-create-update-9jpmx" event={"ID":"a41b81d7-fa6c-4daa-b06d-df1105c0e566","Type":"ContainerStarted","Data":"ab177afa97e4e404dc97276c0081c5b66386c4e4a87dc994439a808b9e91784c"} Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.167776 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-bxzmm" event={"ID":"534bc26b-a52d-40e7-8ac2-4aa407d070a9","Type":"ContainerDied","Data":"7f06b67a6f47fc73ef35f9a9318d29e11e045a40639362f1ac05c45c8aa2862c"} Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.167833 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-bxzmm" Dec 11 05:30:53 crc kubenswrapper[4628]: W1211 05:30:53.174881 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69514995_7ac6_4dea_b519_317e80b5f9fd.slice/crio-7acdd4c67e95f9d9d548437fb0fe038f35d4fae3ddb3224ac8b34bcfad19e922 WatchSource:0}: Error finding container 7acdd4c67e95f9d9d548437fb0fe038f35d4fae3ddb3224ac8b34bcfad19e922: Status 404 returned error can't find the container with id 7acdd4c67e95f9d9d548437fb0fe038f35d4fae3ddb3224ac8b34bcfad19e922 Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.175441 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" event={"ID":"4aa3d172-61ed-43ef-b6c6-cddc2c80565b","Type":"ContainerStarted","Data":"cee9faaf7b63d59afdcb5277216cbf9bfa4fe2c955209e00db03cbed68a97d03"} Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.232157 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bxzmm"] Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.238898 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-bxzmm"] Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.291435 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2a5a-account-create-update-qmvjk"] Dec 11 05:30:53 crc kubenswrapper[4628]: W1211 05:30:53.309391 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1318bf7e_ab46_425f_b121_4423d0623af6.slice/crio-fa18d93ea9517f950c38b835cc5537f4094425ede84248595227429a90f33e85 WatchSource:0}: Error finding container fa18d93ea9517f950c38b835cc5537f4094425ede84248595227429a90f33e85: Status 404 returned error can't find the container with id fa18d93ea9517f950c38b835cc5537f4094425ede84248595227429a90f33e85 Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.324581 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-mzcb5"] Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.601088 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 11 05:30:53 crc kubenswrapper[4628]: I1211 05:30:53.900828 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="534bc26b-a52d-40e7-8ac2-4aa407d070a9" path="/var/lib/kubelet/pods/534bc26b-a52d-40e7-8ac2-4aa407d070a9/volumes" Dec 11 05:30:54 crc kubenswrapper[4628]: I1211 05:30:54.183960 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2a5a-account-create-update-qmvjk" event={"ID":"08120cb0-d26e-4520-a972-829bde3491dc","Type":"ContainerStarted","Data":"29b2e5fa9e8cb4d0ec4a0fde4c1726a089c613749c163c4bf350356651371bde"} Dec 11 05:30:54 crc kubenswrapper[4628]: I1211 05:30:54.186022 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" event={"ID":"69514995-7ac6-4dea-b519-317e80b5f9fd","Type":"ContainerStarted","Data":"7acdd4c67e95f9d9d548437fb0fe038f35d4fae3ddb3224ac8b34bcfad19e922"} Dec 11 05:30:54 crc kubenswrapper[4628]: I1211 05:30:54.187928 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mzcb5" event={"ID":"1318bf7e-ab46-425f-b121-4423d0623af6","Type":"ContainerStarted","Data":"fa18d93ea9517f950c38b835cc5537f4094425ede84248595227429a90f33e85"} Dec 11 05:30:55 crc kubenswrapper[4628]: 
I1211 05:30:55.484734 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xv5gk"] Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.533363 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-cdwch"] Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.534873 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.566644 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-cdwch"] Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.711080 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-dns-svc\") pod \"dnsmasq-dns-698758b865-cdwch\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.711150 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-config\") pod \"dnsmasq-dns-698758b865-cdwch\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.711187 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxqj4\" (UniqueName: \"kubernetes.io/projected/2f6eacbe-bd53-4695-9411-efe751202c1b-kube-api-access-kxqj4\") pod \"dnsmasq-dns-698758b865-cdwch\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.711216 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-cdwch\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.711247 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-cdwch\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.812337 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-cdwch\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.812433 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-dns-svc\") pod \"dnsmasq-dns-698758b865-cdwch\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.812470 4628 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-config\") pod \"dnsmasq-dns-698758b865-cdwch\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.812503 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxqj4\" (UniqueName: \"kubernetes.io/projected/2f6eacbe-bd53-4695-9411-efe751202c1b-kube-api-access-kxqj4\") pod \"dnsmasq-dns-698758b865-cdwch\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.812538 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-cdwch\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.813387 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-cdwch\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.813516 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-dns-svc\") pod \"dnsmasq-dns-698758b865-cdwch\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.813648 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-cdwch\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.813874 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-config\") pod \"dnsmasq-dns-698758b865-cdwch\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.839263 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxqj4\" (UniqueName: \"kubernetes.io/projected/2f6eacbe-bd53-4695-9411-efe751202c1b-kube-api-access-kxqj4\") pod \"dnsmasq-dns-698758b865-cdwch\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:30:55 crc kubenswrapper[4628]: I1211 05:30:55.870953 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.339157 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-cdwch"] Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.694458 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.699248 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.703858 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.704115 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-rj8j6" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.704299 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.704439 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.738163 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.826735 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-etc-swift\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.827024 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/612f2afd-9958-4367-a8c0-13066a05cd11-lock\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.827144 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.827299 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz7w8\" (UniqueName: \"kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-kube-api-access-jz7w8\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.827500 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/612f2afd-9958-4367-a8c0-13066a05cd11-cache\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.929084 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/612f2afd-9958-4367-a8c0-13066a05cd11-cache\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.929150 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-etc-swift\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.929190 4628 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/612f2afd-9958-4367-a8c0-13066a05cd11-lock\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.929225 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.929260 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz7w8\" (UniqueName: \"kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-kube-api-access-jz7w8\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:30:56 crc kubenswrapper[4628]: E1211 05:30:56.929327 4628 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 11 05:30:56 crc kubenswrapper[4628]: E1211 05:30:56.929350 4628 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 11 05:30:56 crc kubenswrapper[4628]: E1211 05:30:56.929405 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-etc-swift podName:612f2afd-9958-4367-a8c0-13066a05cd11 nodeName:}" failed. No retries permitted until 2025-12-11 05:30:57.429385534 +0000 UTC m=+959.846732232 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-etc-swift") pod "swift-storage-0" (UID: "612f2afd-9958-4367-a8c0-13066a05cd11") : configmap "swift-ring-files" not found Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.929740 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/612f2afd-9958-4367-a8c0-13066a05cd11-cache\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.929775 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/612f2afd-9958-4367-a8c0-13066a05cd11-lock\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.929795 4628 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.951728 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz7w8\" (UniqueName: \"kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-kube-api-access-jz7w8\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:30:56 crc kubenswrapper[4628]: I1211 05:30:56.953352 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.051284 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-24c8x"] Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.052687 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-24c8x" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.057954 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c49b-account-create-update-d7hdm"] Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.058807 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c49b-account-create-update-d7hdm" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.060110 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.070110 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-24c8x"] Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.089358 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c49b-account-create-update-d7hdm"] Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.233751 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/300bbc9f-12f3-4da2-bb3f-85a458b574cf-operator-scripts\") pod \"glance-db-create-24c8x\" (UID: \"300bbc9f-12f3-4da2-bb3f-85a458b574cf\") " pod="openstack/glance-db-create-24c8x" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.234320 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ede4d9-f3c1-4e13-948d-abd83adb1397-operator-scripts\") pod \"glance-c49b-account-create-update-d7hdm\" (UID: \"f1ede4d9-f3c1-4e13-948d-abd83adb1397\") " pod="openstack/glance-c49b-account-create-update-d7hdm" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.234525 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lkcc\" (UniqueName: \"kubernetes.io/projected/300bbc9f-12f3-4da2-bb3f-85a458b574cf-kube-api-access-2lkcc\") pod \"glance-db-create-24c8x\" (UID: \"300bbc9f-12f3-4da2-bb3f-85a458b574cf\") " pod="openstack/glance-db-create-24c8x" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.234687 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckfbd\" (UniqueName: \"kubernetes.io/projected/f1ede4d9-f3c1-4e13-948d-abd83adb1397-kube-api-access-ckfbd\") pod \"glance-c49b-account-create-update-d7hdm\" (UID: \"f1ede4d9-f3c1-4e13-948d-abd83adb1397\") " pod="openstack/glance-c49b-account-create-update-d7hdm" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.303728 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-cdwch" event={"ID":"2f6eacbe-bd53-4695-9411-efe751202c1b","Type":"ContainerStarted","Data":"7ccd27ea8702017b74583963f965a334b4f250fa1153e6d6bdc1e1c5490f1f2d"} Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.335954 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/300bbc9f-12f3-4da2-bb3f-85a458b574cf-operator-scripts\") pod \"glance-db-create-24c8x\" (UID: \"300bbc9f-12f3-4da2-bb3f-85a458b574cf\") " pod="openstack/glance-db-create-24c8x" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.336124 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ede4d9-f3c1-4e13-948d-abd83adb1397-operator-scripts\") pod \"glance-c49b-account-create-update-d7hdm\" (UID: \"f1ede4d9-f3c1-4e13-948d-abd83adb1397\") " pod="openstack/glance-c49b-account-create-update-d7hdm" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.336172 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lkcc\" (UniqueName: \"kubernetes.io/projected/300bbc9f-12f3-4da2-bb3f-85a458b574cf-kube-api-access-2lkcc\") pod \"glance-db-create-24c8x\" (UID: \"300bbc9f-12f3-4da2-bb3f-85a458b574cf\") " pod="openstack/glance-db-create-24c8x" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.336199 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckfbd\" (UniqueName: \"kubernetes.io/projected/f1ede4d9-f3c1-4e13-948d-abd83adb1397-kube-api-access-ckfbd\") pod \"glance-c49b-account-create-update-d7hdm\" (UID: \"f1ede4d9-f3c1-4e13-948d-abd83adb1397\") " pod="openstack/glance-c49b-account-create-update-d7hdm" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.338330 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/300bbc9f-12f3-4da2-bb3f-85a458b574cf-operator-scripts\") pod \"glance-db-create-24c8x\" (UID: \"300bbc9f-12f3-4da2-bb3f-85a458b574cf\") " pod="openstack/glance-db-create-24c8x" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.338699 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ede4d9-f3c1-4e13-948d-abd83adb1397-operator-scripts\") pod \"glance-c49b-account-create-update-d7hdm\" (UID: \"f1ede4d9-f3c1-4e13-948d-abd83adb1397\") " pod="openstack/glance-c49b-account-create-update-d7hdm" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.360085 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lkcc\" (UniqueName: \"kubernetes.io/projected/300bbc9f-12f3-4da2-bb3f-85a458b574cf-kube-api-access-2lkcc\") pod \"glance-db-create-24c8x\" (UID: \"300bbc9f-12f3-4da2-bb3f-85a458b574cf\") " pod="openstack/glance-db-create-24c8x" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.360462 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckfbd\" (UniqueName: \"kubernetes.io/projected/f1ede4d9-f3c1-4e13-948d-abd83adb1397-kube-api-access-ckfbd\") pod \"glance-c49b-account-create-update-d7hdm\" (UID: \"f1ede4d9-f3c1-4e13-948d-abd83adb1397\") " pod="openstack/glance-c49b-account-create-update-d7hdm" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.368697 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-24c8x" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.374943 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c49b-account-create-update-d7hdm" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.438252 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-etc-swift\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:30:57 crc kubenswrapper[4628]: E1211 05:30:57.438469 4628 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 11 05:30:57 crc kubenswrapper[4628]: E1211 05:30:57.438498 4628 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 11 05:30:57 crc kubenswrapper[4628]: E1211 05:30:57.438556 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-etc-swift podName:612f2afd-9958-4367-a8c0-13066a05cd11 nodeName:}" failed. No retries permitted until 2025-12-11 05:30:58.43853778 +0000 UTC m=+960.855884478 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-etc-swift") pod "swift-storage-0" (UID: "612f2afd-9958-4367-a8c0-13066a05cd11") : configmap "swift-ring-files" not found Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.544912 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" podUID="72259b41-2e95-4531-a2ff-2939e437253c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.98:5353: connect: connection refused" Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.933439 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-t4rhc"] Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.983583 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c49b-account-create-update-d7hdm"] Dec 11 05:30:57 crc kubenswrapper[4628]: I1211 05:30:57.983820 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.001270 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.047245 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.047627 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.047792 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.061913 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-ring-data-devices\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.061993 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-swiftconf\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.062048 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-etc-swift\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.062129 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-scripts\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.062165 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjlvh\" (UniqueName: \"kubernetes.io/projected/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-kube-api-access-bjlvh\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.062240 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-dispersionconf\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.062337 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-combined-ca-bundle\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 
05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.062504 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-t4rhc"] Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.164428 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-ring-data-devices\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.164497 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-swiftconf\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.164541 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-etc-swift\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.164579 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-scripts\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.164603 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjlvh\" (UniqueName: \"kubernetes.io/projected/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-kube-api-access-bjlvh\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.164661 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-dispersionconf\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.164728 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-combined-ca-bundle\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.168510 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-etc-swift\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.185567 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-ring-data-devices\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " 
pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.186935 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-scripts\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.215939 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-combined-ca-bundle\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.218422 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-dispersionconf\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.241692 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-swiftconf\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.254015 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjlvh\" (UniqueName: \"kubernetes.io/projected/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-kube-api-access-bjlvh\") pod \"swift-ring-rebalance-t4rhc\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.366254 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" event={"ID":"4aa3d172-61ed-43ef-b6c6-cddc2c80565b","Type":"ContainerStarted","Data":"b50a4b614aed61619ba1364f5a4d6b264dd73241f82594fb2a19a4c6ae09ccdc"} Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.378667 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-cdwch" event={"ID":"2f6eacbe-bd53-4695-9411-efe751202c1b","Type":"ContainerStarted","Data":"37e5f8a8a3dce4af5c0d900bf0f8d9303fdecb84fd394e95b2b5df0760f89754"} Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.388032 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-szj8g" event={"ID":"c4361dca-0563-4576-a32a-2f03e4f399a0","Type":"ContainerStarted","Data":"10a2b92809b94afbcca6cf20b332e1343c4a1f80a38e2cafcc271acd51869cd7"} Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.399053 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-k98qz" event={"ID":"6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f","Type":"ContainerStarted","Data":"64b1947fedcd6a1a6e6e81a6e66951f505d75885d5b1ebc254bdae52874769b0"} Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.401540 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ac9c-account-create-update-9jpmx" event={"ID":"a41b81d7-fa6c-4daa-b06d-df1105c0e566","Type":"ContainerStarted","Data":"dba35f191d2ecfc5135c3769ee1a25f42bbe3da0164752941bf2604ba1af84c7"} Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 
05:30:58.403139 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mzcb5" event={"ID":"1318bf7e-ab46-425f-b121-4423d0623af6","Type":"ContainerStarted","Data":"d3dabf5ed46d314ca946e16acce857c86397b012fa65749804022453003eced9"} Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.404335 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c49b-account-create-update-d7hdm" event={"ID":"f1ede4d9-f3c1-4e13-948d-abd83adb1397","Type":"ContainerStarted","Data":"37e6cff88ecea61add26e5db05546514469e1d1a128a25f762a55d17929bb438"} Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.408746 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-rj8j6" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.414838 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.437257 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-k98qz" podStartSLOduration=7.437240174 podStartE2EDuration="7.437240174s" podCreationTimestamp="2025-12-11 05:30:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:30:58.425946895 +0000 UTC m=+960.843293593" watchObservedRunningTime="2025-12-11 05:30:58.437240174 +0000 UTC m=+960.854586872" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.458297 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-ac9c-account-create-update-9jpmx" podStartSLOduration=7.458280231 podStartE2EDuration="7.458280231s" podCreationTimestamp="2025-12-11 05:30:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:30:58.455620862 +0000 UTC m=+960.872967560" watchObservedRunningTime="2025-12-11 05:30:58.458280231 +0000 UTC m=+960.875626949" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.484312 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-szj8g" podStartSLOduration=7.484294921 podStartE2EDuration="7.484294921s" podCreationTimestamp="2025-12-11 05:30:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:30:58.473642268 +0000 UTC m=+960.890988966" watchObservedRunningTime="2025-12-11 05:30:58.484294921 +0000 UTC m=+960.901641619" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.486689 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-etc-swift\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:30:58 crc kubenswrapper[4628]: E1211 05:30:58.489417 4628 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 11 05:30:58 crc kubenswrapper[4628]: E1211 05:30:58.489432 4628 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 11 05:30:58 crc kubenswrapper[4628]: E1211 05:30:58.489469 4628 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-etc-swift podName:612f2afd-9958-4367-a8c0-13066a05cd11 nodeName:}" failed. No retries permitted until 2025-12-11 05:31:00.489455008 +0000 UTC m=+962.906801706 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-etc-swift") pod "swift-storage-0" (UID: "612f2afd-9958-4367-a8c0-13066a05cd11") : configmap "swift-ring-files" not found Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.558573 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.567376 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-24c8x"] Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.692710 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72259b41-2e95-4531-a2ff-2939e437253c-config\") pod \"72259b41-2e95-4531-a2ff-2939e437253c\" (UID: \"72259b41-2e95-4531-a2ff-2939e437253c\") " Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.693089 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bq6n\" (UniqueName: \"kubernetes.io/projected/72259b41-2e95-4531-a2ff-2939e437253c-kube-api-access-5bq6n\") pod \"72259b41-2e95-4531-a2ff-2939e437253c\" (UID: \"72259b41-2e95-4531-a2ff-2939e437253c\") " Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.693186 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72259b41-2e95-4531-a2ff-2939e437253c-dns-svc\") pod \"72259b41-2e95-4531-a2ff-2939e437253c\" (UID: \"72259b41-2e95-4531-a2ff-2939e437253c\") " Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.700667 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72259b41-2e95-4531-a2ff-2939e437253c-kube-api-access-5bq6n" (OuterVolumeSpecName: "kube-api-access-5bq6n") pod "72259b41-2e95-4531-a2ff-2939e437253c" (UID: "72259b41-2e95-4531-a2ff-2939e437253c"). InnerVolumeSpecName "kube-api-access-5bq6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.756735 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72259b41-2e95-4531-a2ff-2939e437253c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "72259b41-2e95-4531-a2ff-2939e437253c" (UID: "72259b41-2e95-4531-a2ff-2939e437253c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.781064 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72259b41-2e95-4531-a2ff-2939e437253c-config" (OuterVolumeSpecName: "config") pod "72259b41-2e95-4531-a2ff-2939e437253c" (UID: "72259b41-2e95-4531-a2ff-2939e437253c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.795625 4628 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72259b41-2e95-4531-a2ff-2939e437253c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.795656 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72259b41-2e95-4531-a2ff-2939e437253c-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:30:58 crc kubenswrapper[4628]: I1211 05:30:58.795666 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bq6n\" (UniqueName: \"kubernetes.io/projected/72259b41-2e95-4531-a2ff-2939e437253c-kube-api-access-5bq6n\") on node \"crc\" DevicePath \"\"" Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.083385 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-t4rhc"] Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.415107 4628 generic.go:334] "Generic (PLEG): container finished" podID="69514995-7ac6-4dea-b519-317e80b5f9fd" containerID="6026922b70aa95009023ff47b8bba02a48e5ead8ee279bc6882f2ae6cf6cc5e8" exitCode=0 Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.415168 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" event={"ID":"69514995-7ac6-4dea-b519-317e80b5f9fd","Type":"ContainerDied","Data":"6026922b70aa95009023ff47b8bba02a48e5ead8ee279bc6882f2ae6cf6cc5e8"} Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.417284 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-t4rhc" event={"ID":"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1","Type":"ContainerStarted","Data":"62afa1516216500faffb04826e3605338be9104e3f863e13454e4b4f5bd17168"} Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.420074 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-24c8x" event={"ID":"300bbc9f-12f3-4da2-bb3f-85a458b574cf","Type":"ContainerStarted","Data":"3836ed00ef5c73025e39977eb2120f3f5cf6f09ccc0bc2fa86b07297ea274b62"} Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.420376 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-24c8x" event={"ID":"300bbc9f-12f3-4da2-bb3f-85a458b574cf","Type":"ContainerStarted","Data":"a3e1047c35f629dee104c67f18b957f4a8baa93515290d2419eb06c28565df0b"} Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.422577 4628 generic.go:334] "Generic (PLEG): container finished" podID="2f6eacbe-bd53-4695-9411-efe751202c1b" containerID="37e5f8a8a3dce4af5c0d900bf0f8d9303fdecb84fd394e95b2b5df0760f89754" exitCode=0 Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.422632 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-cdwch" event={"ID":"2f6eacbe-bd53-4695-9411-efe751202c1b","Type":"ContainerDied","Data":"37e5f8a8a3dce4af5c0d900bf0f8d9303fdecb84fd394e95b2b5df0760f89754"} Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.427502 4628 generic.go:334] "Generic (PLEG): container finished" podID="08120cb0-d26e-4520-a972-829bde3491dc" containerID="d655cb6ef20d42acc69695f35f24e60440857fa52c04308eed046368968a76ce" exitCode=0 Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.427581 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2a5a-account-create-update-qmvjk" 
event={"ID":"08120cb0-d26e-4520-a972-829bde3491dc","Type":"ContainerDied","Data":"d655cb6ef20d42acc69695f35f24e60440857fa52c04308eed046368968a76ce"} Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.429144 4628 generic.go:334] "Generic (PLEG): container finished" podID="4aa3d172-61ed-43ef-b6c6-cddc2c80565b" containerID="b50a4b614aed61619ba1364f5a4d6b264dd73241f82594fb2a19a4c6ae09ccdc" exitCode=0 Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.429258 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" event={"ID":"4aa3d172-61ed-43ef-b6c6-cddc2c80565b","Type":"ContainerDied","Data":"b50a4b614aed61619ba1364f5a4d6b264dd73241f82594fb2a19a4c6ae09ccdc"} Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.434169 4628 generic.go:334] "Generic (PLEG): container finished" podID="6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f" containerID="64b1947fedcd6a1a6e6e81a6e66951f505d75885d5b1ebc254bdae52874769b0" exitCode=0 Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.434235 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-k98qz" event={"ID":"6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f","Type":"ContainerDied","Data":"64b1947fedcd6a1a6e6e81a6e66951f505d75885d5b1ebc254bdae52874769b0"} Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.440215 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" event={"ID":"72259b41-2e95-4531-a2ff-2939e437253c","Type":"ContainerDied","Data":"b09f065b504e0afc5873f315b7a7b94b143331a0842a628784021dd0e573d97a"} Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.440276 4628 scope.go:117] "RemoveContainer" containerID="50ef0e9377a23d7f2adb8af12c0fd32e76aa5ede5aa6c12737ce551c61461f01" Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.440420 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4ms4n" Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.462114 4628 generic.go:334] "Generic (PLEG): container finished" podID="a41b81d7-fa6c-4daa-b06d-df1105c0e566" containerID="dba35f191d2ecfc5135c3769ee1a25f42bbe3da0164752941bf2604ba1af84c7" exitCode=0 Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.462302 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ac9c-account-create-update-9jpmx" event={"ID":"a41b81d7-fa6c-4daa-b06d-df1105c0e566","Type":"ContainerDied","Data":"dba35f191d2ecfc5135c3769ee1a25f42bbe3da0164752941bf2604ba1af84c7"} Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.483054 4628 generic.go:334] "Generic (PLEG): container finished" podID="1318bf7e-ab46-425f-b121-4423d0623af6" containerID="d3dabf5ed46d314ca946e16acce857c86397b012fa65749804022453003eced9" exitCode=0 Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.483061 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-24c8x" podStartSLOduration=2.483041396 podStartE2EDuration="2.483041396s" podCreationTimestamp="2025-12-11 05:30:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:30:59.476244016 +0000 UTC m=+961.893590704" watchObservedRunningTime="2025-12-11 05:30:59.483041396 +0000 UTC m=+961.900388094" Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.483150 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mzcb5" event={"ID":"1318bf7e-ab46-425f-b121-4423d0623af6","Type":"ContainerDied","Data":"d3dabf5ed46d314ca946e16acce857c86397b012fa65749804022453003eced9"} Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.486169 4628 generic.go:334] "Generic (PLEG): container finished" podID="f1ede4d9-f3c1-4e13-948d-abd83adb1397" containerID="2ef618827cdb881bfe76bf1c683e0699f7a0cb3b6f729c0565f73ab04b9fb4b7" exitCode=0 Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.487118 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c49b-account-create-update-d7hdm" event={"ID":"f1ede4d9-f3c1-4e13-948d-abd83adb1397","Type":"ContainerDied","Data":"2ef618827cdb881bfe76bf1c683e0699f7a0cb3b6f729c0565f73ab04b9fb4b7"} Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.646206 4628 scope.go:117] "RemoveContainer" containerID="5f8996d8a55741f8174cbaa5144807a280e7d66ba8e8511457a5325a007488ec" Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.716744 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.719357 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4ms4n"] Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.728679 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4ms4n"] Dec 11 05:30:59 crc kubenswrapper[4628]: E1211 05:30:59.750257 4628 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 11 05:30:59 crc kubenswrapper[4628]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/69514995-7ac6-4dea-b519-317e80b5f9fd/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 11 05:30:59 crc kubenswrapper[4628]: > podSandboxID="7acdd4c67e95f9d9d548437fb0fe038f35d4fae3ddb3224ac8b34bcfad19e922" Dec 11 05:30:59 crc kubenswrapper[4628]: E1211 05:30:59.750397 4628 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 11 05:30:59 crc kubenswrapper[4628]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n599h5cbh7ch5d4h66fh676hdbh546h95h88h5ffh55ch7fhch57ch687hddhc7h5fdh57dh674h56fh64ch98h9bh557h55dh646h54ch54fh5c4h597q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gxz29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86db49b7ff-j4d2g_openstack(69514995-7ac6-4dea-b519-317e80b5f9fd): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/69514995-7ac6-4dea-b519-317e80b5f9fd/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 11 05:30:59 crc kubenswrapper[4628]: > logger="UnhandledError" Dec 11 05:30:59 crc kubenswrapper[4628]: E1211 05:30:59.753526 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/69514995-7ac6-4dea-b519-317e80b5f9fd/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" podUID="69514995-7ac6-4dea-b519-317e80b5f9fd" Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.814217 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-ovsdbserver-sb\") pod \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\" (UID: \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\") " Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.814343 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-config\") pod \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\" (UID: \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\") " Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.814381 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxzhg\" (UniqueName: \"kubernetes.io/projected/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-kube-api-access-kxzhg\") pod \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\" (UID: \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\") " Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.814510 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-dns-svc\") pod \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\" (UID: \"4aa3d172-61ed-43ef-b6c6-cddc2c80565b\") " Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.818127 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-kube-api-access-kxzhg" (OuterVolumeSpecName: "kube-api-access-kxzhg") pod "4aa3d172-61ed-43ef-b6c6-cddc2c80565b" (UID: "4aa3d172-61ed-43ef-b6c6-cddc2c80565b"). InnerVolumeSpecName "kube-api-access-kxzhg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.833088 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4aa3d172-61ed-43ef-b6c6-cddc2c80565b" (UID: "4aa3d172-61ed-43ef-b6c6-cddc2c80565b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.847361 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-config" (OuterVolumeSpecName: "config") pod "4aa3d172-61ed-43ef-b6c6-cddc2c80565b" (UID: "4aa3d172-61ed-43ef-b6c6-cddc2c80565b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.851538 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4aa3d172-61ed-43ef-b6c6-cddc2c80565b" (UID: "4aa3d172-61ed-43ef-b6c6-cddc2c80565b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.901115 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72259b41-2e95-4531-a2ff-2939e437253c" path="/var/lib/kubelet/pods/72259b41-2e95-4531-a2ff-2939e437253c/volumes" Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.916034 4628 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.916061 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.916071 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:30:59 crc kubenswrapper[4628]: I1211 05:30:59.916080 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxzhg\" (UniqueName: \"kubernetes.io/projected/4aa3d172-61ed-43ef-b6c6-cddc2c80565b-kube-api-access-kxzhg\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:00 crc kubenswrapper[4628]: I1211 05:31:00.497518 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" Dec 11 05:31:00 crc kubenswrapper[4628]: I1211 05:31:00.497509 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-xv5gk" event={"ID":"4aa3d172-61ed-43ef-b6c6-cddc2c80565b","Type":"ContainerDied","Data":"cee9faaf7b63d59afdcb5277216cbf9bfa4fe2c955209e00db03cbed68a97d03"} Dec 11 05:31:00 crc kubenswrapper[4628]: I1211 05:31:00.498253 4628 scope.go:117] "RemoveContainer" containerID="b50a4b614aed61619ba1364f5a4d6b264dd73241f82594fb2a19a4c6ae09ccdc" Dec 11 05:31:00 crc kubenswrapper[4628]: I1211 05:31:00.502324 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-cdwch" event={"ID":"2f6eacbe-bd53-4695-9411-efe751202c1b","Type":"ContainerStarted","Data":"f14bf99d128610949a878fd7fd299ac9e0d58370b422b1522bfa2f6179a73b35"} Dec 11 05:31:00 crc kubenswrapper[4628]: I1211 05:31:00.503087 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:31:00 crc kubenswrapper[4628]: I1211 05:31:00.506952 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0","Type":"ContainerStarted","Data":"c3feded484f0ca8ae227286130541c6946f8c87a7d43723b76bdc0e272345b66"} Dec 11 05:31:00 crc kubenswrapper[4628]: I1211 05:31:00.506985 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0ba5be80-485c-4b8b-8e1d-3326db7cc5a0","Type":"ContainerStarted","Data":"062e702cb06a7a66e847340f5a9ebf2a646f6fbb9d9e08f9be3340f45113b554"} Dec 11 05:31:00 crc kubenswrapper[4628]: I1211 05:31:00.507514 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 11 05:31:00 crc kubenswrapper[4628]: I1211 05:31:00.509464 4628 generic.go:334] "Generic (PLEG): container finished" podID="300bbc9f-12f3-4da2-bb3f-85a458b574cf" containerID="3836ed00ef5c73025e39977eb2120f3f5cf6f09ccc0bc2fa86b07297ea274b62" exitCode=0 Dec 11 05:31:00 crc kubenswrapper[4628]: I1211 05:31:00.509681 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-24c8x" event={"ID":"300bbc9f-12f3-4da2-bb3f-85a458b574cf","Type":"ContainerDied","Data":"3836ed00ef5c73025e39977eb2120f3f5cf6f09ccc0bc2fa86b07297ea274b62"} Dec 11 05:31:00 crc kubenswrapper[4628]: I1211 05:31:00.538972 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-etc-swift\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:31:00 crc kubenswrapper[4628]: E1211 05:31:00.539185 4628 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 11 05:31:00 crc kubenswrapper[4628]: E1211 05:31:00.539199 4628 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 11 05:31:00 crc kubenswrapper[4628]: E1211 05:31:00.539260 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-etc-swift podName:612f2afd-9958-4367-a8c0-13066a05cd11 nodeName:}" failed. No retries permitted until 2025-12-11 05:31:04.539246163 +0000 UTC m=+966.956592861 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-etc-swift") pod "swift-storage-0" (UID: "612f2afd-9958-4367-a8c0-13066a05cd11") : configmap "swift-ring-files" not found Dec 11 05:31:00 crc kubenswrapper[4628]: I1211 05:31:00.590408 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xv5gk"] Dec 11 05:31:00 crc kubenswrapper[4628]: I1211 05:31:00.607920 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-xv5gk"] Dec 11 05:31:00 crc kubenswrapper[4628]: I1211 05:31:00.618377 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-cdwch" podStartSLOduration=5.618360459 podStartE2EDuration="5.618360459s" podCreationTimestamp="2025-12-11 05:30:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:31:00.571150029 +0000 UTC m=+962.988496787" watchObservedRunningTime="2025-12-11 05:31:00.618360459 +0000 UTC m=+963.035707157" Dec 11 05:31:00 crc kubenswrapper[4628]: I1211 05:31:00.649076 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.156534696 podStartE2EDuration="9.649058323s" podCreationTimestamp="2025-12-11 05:30:51 +0000 UTC" firstStartedPulling="2025-12-11 05:30:53.043974395 +0000 UTC m=+955.461321093" lastFinishedPulling="2025-12-11 05:30:59.536498022 +0000 UTC m=+961.953844720" observedRunningTime="2025-12-11 05:31:00.607773419 +0000 UTC m=+963.025120117" watchObservedRunningTime="2025-12-11 05:31:00.649058323 +0000 UTC m=+963.066405021" Dec 11 05:31:01 crc kubenswrapper[4628]: I1211 05:31:01.427355 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:31:01 crc kubenswrapper[4628]: I1211 05:31:01.427715 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:31:01 crc kubenswrapper[4628]: I1211 05:31:01.427763 4628 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:31:01 crc kubenswrapper[4628]: I1211 05:31:01.429286 4628 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4024bd10762b90d0b487ed903bd8b69e2ebeac5fe50ac7d4b3037fdf7a40c2b1"} pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 05:31:01 crc kubenswrapper[4628]: I1211 05:31:01.429344 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" containerID="cri-o://4024bd10762b90d0b487ed903bd8b69e2ebeac5fe50ac7d4b3037fdf7a40c2b1" gracePeriod=600 Dec 11 05:31:01 crc 
kubenswrapper[4628]: I1211 05:31:01.519663 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" event={"ID":"69514995-7ac6-4dea-b519-317e80b5f9fd","Type":"ContainerStarted","Data":"a59d3a81aff1260bc150430f51ef458bdd872ed471d26b29da6265800c5ad75b"} Dec 11 05:31:01 crc kubenswrapper[4628]: I1211 05:31:01.520705 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:31:01 crc kubenswrapper[4628]: I1211 05:31:01.545283 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" podStartSLOduration=10.545266841 podStartE2EDuration="10.545266841s" podCreationTimestamp="2025-12-11 05:30:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:31:01.539361935 +0000 UTC m=+963.956708633" watchObservedRunningTime="2025-12-11 05:31:01.545266841 +0000 UTC m=+963.962613539" Dec 11 05:31:01 crc kubenswrapper[4628]: I1211 05:31:01.901037 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aa3d172-61ed-43ef-b6c6-cddc2c80565b" path="/var/lib/kubelet/pods/4aa3d172-61ed-43ef-b6c6-cddc2c80565b/volumes" Dec 11 05:31:02 crc kubenswrapper[4628]: I1211 05:31:02.528679 4628 generic.go:334] "Generic (PLEG): container finished" podID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerID="4024bd10762b90d0b487ed903bd8b69e2ebeac5fe50ac7d4b3037fdf7a40c2b1" exitCode=0 Dec 11 05:31:02 crc kubenswrapper[4628]: I1211 05:31:02.528771 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerDied","Data":"4024bd10762b90d0b487ed903bd8b69e2ebeac5fe50ac7d4b3037fdf7a40c2b1"} Dec 11 05:31:02 crc kubenswrapper[4628]: I1211 05:31:02.528830 4628 scope.go:117] "RemoveContainer" containerID="2edbf0424a7d52e635507a6262c52a38d0cf51657fa8d3615985b25f98b6c93c" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.004208 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2a5a-account-create-update-qmvjk" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.032831 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-k98qz" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.040966 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ac9c-account-create-update-9jpmx" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.072767 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c49b-account-create-update-d7hdm" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.095744 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-mzcb5" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.098030 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08120cb0-d26e-4520-a972-829bde3491dc-operator-scripts\") pod \"08120cb0-d26e-4520-a972-829bde3491dc\" (UID: \"08120cb0-d26e-4520-a972-829bde3491dc\") " Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.098182 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m22p9\" (UniqueName: \"kubernetes.io/projected/08120cb0-d26e-4520-a972-829bde3491dc-kube-api-access-m22p9\") pod \"08120cb0-d26e-4520-a972-829bde3491dc\" (UID: \"08120cb0-d26e-4520-a972-829bde3491dc\") " Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.098262 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvckd\" (UniqueName: \"kubernetes.io/projected/a41b81d7-fa6c-4daa-b06d-df1105c0e566-kube-api-access-nvckd\") pod \"a41b81d7-fa6c-4daa-b06d-df1105c0e566\" (UID: \"a41b81d7-fa6c-4daa-b06d-df1105c0e566\") " Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.098287 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm22m\" (UniqueName: \"kubernetes.io/projected/6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f-kube-api-access-xm22m\") pod \"6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f\" (UID: \"6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f\") " Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.098306 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f-operator-scripts\") pod \"6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f\" (UID: \"6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f\") " Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.098321 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a41b81d7-fa6c-4daa-b06d-df1105c0e566-operator-scripts\") pod \"a41b81d7-fa6c-4daa-b06d-df1105c0e566\" (UID: \"a41b81d7-fa6c-4daa-b06d-df1105c0e566\") " Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.099838 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f" (UID: "6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.100309 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a41b81d7-fa6c-4daa-b06d-df1105c0e566-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a41b81d7-fa6c-4daa-b06d-df1105c0e566" (UID: "a41b81d7-fa6c-4daa-b06d-df1105c0e566"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.105665 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41b81d7-fa6c-4daa-b06d-df1105c0e566-kube-api-access-nvckd" (OuterVolumeSpecName: "kube-api-access-nvckd") pod "a41b81d7-fa6c-4daa-b06d-df1105c0e566" (UID: "a41b81d7-fa6c-4daa-b06d-df1105c0e566"). InnerVolumeSpecName "kube-api-access-nvckd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.106217 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-24c8x" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.106641 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08120cb0-d26e-4520-a972-829bde3491dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "08120cb0-d26e-4520-a972-829bde3491dc" (UID: "08120cb0-d26e-4520-a972-829bde3491dc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.112255 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvckd\" (UniqueName: \"kubernetes.io/projected/a41b81d7-fa6c-4daa-b06d-df1105c0e566-kube-api-access-nvckd\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.112283 4628 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.112292 4628 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a41b81d7-fa6c-4daa-b06d-df1105c0e566-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.112301 4628 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08120cb0-d26e-4520-a972-829bde3491dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.128112 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08120cb0-d26e-4520-a972-829bde3491dc-kube-api-access-m22p9" (OuterVolumeSpecName: "kube-api-access-m22p9") pod "08120cb0-d26e-4520-a972-829bde3491dc" (UID: "08120cb0-d26e-4520-a972-829bde3491dc"). InnerVolumeSpecName "kube-api-access-m22p9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.130356 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f-kube-api-access-xm22m" (OuterVolumeSpecName: "kube-api-access-xm22m") pod "6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f" (UID: "6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f"). InnerVolumeSpecName "kube-api-access-xm22m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.213088 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1318bf7e-ab46-425f-b121-4423d0623af6-operator-scripts\") pod \"1318bf7e-ab46-425f-b121-4423d0623af6\" (UID: \"1318bf7e-ab46-425f-b121-4423d0623af6\") " Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.213177 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lkcc\" (UniqueName: \"kubernetes.io/projected/300bbc9f-12f3-4da2-bb3f-85a458b574cf-kube-api-access-2lkcc\") pod \"300bbc9f-12f3-4da2-bb3f-85a458b574cf\" (UID: \"300bbc9f-12f3-4da2-bb3f-85a458b574cf\") " Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.213199 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckfbd\" (UniqueName: \"kubernetes.io/projected/f1ede4d9-f3c1-4e13-948d-abd83adb1397-kube-api-access-ckfbd\") pod \"f1ede4d9-f3c1-4e13-948d-abd83adb1397\" (UID: \"f1ede4d9-f3c1-4e13-948d-abd83adb1397\") " Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.213274 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/300bbc9f-12f3-4da2-bb3f-85a458b574cf-operator-scripts\") pod \"300bbc9f-12f3-4da2-bb3f-85a458b574cf\" (UID: \"300bbc9f-12f3-4da2-bb3f-85a458b574cf\") " Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.213295 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ede4d9-f3c1-4e13-948d-abd83adb1397-operator-scripts\") pod \"f1ede4d9-f3c1-4e13-948d-abd83adb1397\" (UID: \"f1ede4d9-f3c1-4e13-948d-abd83adb1397\") " Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.213333 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25fln\" (UniqueName: \"kubernetes.io/projected/1318bf7e-ab46-425f-b121-4423d0623af6-kube-api-access-25fln\") pod \"1318bf7e-ab46-425f-b121-4423d0623af6\" (UID: \"1318bf7e-ab46-425f-b121-4423d0623af6\") " Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.213663 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m22p9\" (UniqueName: \"kubernetes.io/projected/08120cb0-d26e-4520-a972-829bde3491dc-kube-api-access-m22p9\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.213681 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm22m\" (UniqueName: \"kubernetes.io/projected/6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f-kube-api-access-xm22m\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.214541 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1ede4d9-f3c1-4e13-948d-abd83adb1397-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1ede4d9-f3c1-4e13-948d-abd83adb1397" (UID: "f1ede4d9-f3c1-4e13-948d-abd83adb1397"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.214650 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1318bf7e-ab46-425f-b121-4423d0623af6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1318bf7e-ab46-425f-b121-4423d0623af6" (UID: "1318bf7e-ab46-425f-b121-4423d0623af6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.215003 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/300bbc9f-12f3-4da2-bb3f-85a458b574cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "300bbc9f-12f3-4da2-bb3f-85a458b574cf" (UID: "300bbc9f-12f3-4da2-bb3f-85a458b574cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.218759 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1ede4d9-f3c1-4e13-948d-abd83adb1397-kube-api-access-ckfbd" (OuterVolumeSpecName: "kube-api-access-ckfbd") pod "f1ede4d9-f3c1-4e13-948d-abd83adb1397" (UID: "f1ede4d9-f3c1-4e13-948d-abd83adb1397"). InnerVolumeSpecName "kube-api-access-ckfbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.218797 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1318bf7e-ab46-425f-b121-4423d0623af6-kube-api-access-25fln" (OuterVolumeSpecName: "kube-api-access-25fln") pod "1318bf7e-ab46-425f-b121-4423d0623af6" (UID: "1318bf7e-ab46-425f-b121-4423d0623af6"). InnerVolumeSpecName "kube-api-access-25fln". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.233997 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/300bbc9f-12f3-4da2-bb3f-85a458b574cf-kube-api-access-2lkcc" (OuterVolumeSpecName: "kube-api-access-2lkcc") pod "300bbc9f-12f3-4da2-bb3f-85a458b574cf" (UID: "300bbc9f-12f3-4da2-bb3f-85a458b574cf"). InnerVolumeSpecName "kube-api-access-2lkcc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.315174 4628 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/300bbc9f-12f3-4da2-bb3f-85a458b574cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.315202 4628 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ede4d9-f3c1-4e13-948d-abd83adb1397-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.315213 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25fln\" (UniqueName: \"kubernetes.io/projected/1318bf7e-ab46-425f-b121-4423d0623af6-kube-api-access-25fln\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.315223 4628 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1318bf7e-ab46-425f-b121-4423d0623af6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.315231 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lkcc\" (UniqueName: \"kubernetes.io/projected/300bbc9f-12f3-4da2-bb3f-85a458b574cf-kube-api-access-2lkcc\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.315239 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckfbd\" (UniqueName: \"kubernetes.io/projected/f1ede4d9-f3c1-4e13-948d-abd83adb1397-kube-api-access-ckfbd\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.537315 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"6ecc8b439306d6103b7fabe922fa79c181a14fe03fbd4c6f00b4023e3934e67c"} Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.540555 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2a5a-account-create-update-qmvjk" event={"ID":"08120cb0-d26e-4520-a972-829bde3491dc","Type":"ContainerDied","Data":"29b2e5fa9e8cb4d0ec4a0fde4c1726a089c613749c163c4bf350356651371bde"} Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.540682 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29b2e5fa9e8cb4d0ec4a0fde4c1726a089c613749c163c4bf350356651371bde" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.540785 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2a5a-account-create-update-qmvjk" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.544212 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-k98qz" event={"ID":"6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f","Type":"ContainerDied","Data":"222c5d1b70f0f09aeb4bc3fa0c04f30219b0c569f84c2824ece1bd0ff755677a"} Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.544239 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="222c5d1b70f0f09aeb4bc3fa0c04f30219b0c569f84c2824ece1bd0ff755677a" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.544276 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-k98qz" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.558634 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ac9c-account-create-update-9jpmx" event={"ID":"a41b81d7-fa6c-4daa-b06d-df1105c0e566","Type":"ContainerDied","Data":"ab177afa97e4e404dc97276c0081c5b66386c4e4a87dc994439a808b9e91784c"} Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.558675 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab177afa97e4e404dc97276c0081c5b66386c4e4a87dc994439a808b9e91784c" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.558699 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ac9c-account-create-update-9jpmx" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.561806 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-t4rhc" event={"ID":"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1","Type":"ContainerStarted","Data":"b9e9128e46cd2edb45f5e538a69d47cd5e0ccc920eca3d955343c2f26ae59cab"} Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.564251 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-24c8x" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.565949 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mzcb5" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.564235 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-24c8x" event={"ID":"300bbc9f-12f3-4da2-bb3f-85a458b574cf","Type":"ContainerDied","Data":"a3e1047c35f629dee104c67f18b957f4a8baa93515290d2419eb06c28565df0b"} Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.567626 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3e1047c35f629dee104c67f18b957f4a8baa93515290d2419eb06c28565df0b" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.567652 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mzcb5" event={"ID":"1318bf7e-ab46-425f-b121-4423d0623af6","Type":"ContainerDied","Data":"fa18d93ea9517f950c38b835cc5537f4094425ede84248595227429a90f33e85"} Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.567671 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa18d93ea9517f950c38b835cc5537f4094425ede84248595227429a90f33e85" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.576782 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c49b-account-create-update-d7hdm" event={"ID":"f1ede4d9-f3c1-4e13-948d-abd83adb1397","Type":"ContainerDied","Data":"37e6cff88ecea61add26e5db05546514469e1d1a128a25f762a55d17929bb438"} Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.576838 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37e6cff88ecea61add26e5db05546514469e1d1a128a25f762a55d17929bb438" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.576876 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c49b-account-create-update-d7hdm" Dec 11 05:31:03 crc kubenswrapper[4628]: I1211 05:31:03.595915 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-t4rhc" podStartSLOduration=2.85248034 podStartE2EDuration="6.595896349s" podCreationTimestamp="2025-12-11 05:30:57 +0000 UTC" firstStartedPulling="2025-12-11 05:30:59.138764706 +0000 UTC m=+961.556111404" lastFinishedPulling="2025-12-11 05:31:02.882180715 +0000 UTC m=+965.299527413" observedRunningTime="2025-12-11 05:31:03.591176335 +0000 UTC m=+966.008523083" watchObservedRunningTime="2025-12-11 05:31:03.595896349 +0000 UTC m=+966.013243047" Dec 11 05:31:04 crc kubenswrapper[4628]: I1211 05:31:04.640104 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-etc-swift\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:31:04 crc kubenswrapper[4628]: E1211 05:31:04.640284 4628 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 11 05:31:04 crc kubenswrapper[4628]: E1211 05:31:04.640462 4628 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 11 05:31:04 crc kubenswrapper[4628]: E1211 05:31:04.640533 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-etc-swift podName:612f2afd-9958-4367-a8c0-13066a05cd11 nodeName:}" failed. No retries permitted until 2025-12-11 05:31:12.640500849 +0000 UTC m=+975.057847547 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-etc-swift") pod "swift-storage-0" (UID: "612f2afd-9958-4367-a8c0-13066a05cd11") : configmap "swift-ring-files" not found Dec 11 05:31:05 crc kubenswrapper[4628]: I1211 05:31:05.872023 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:31:05 crc kubenswrapper[4628]: I1211 05:31:05.980157 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-j4d2g"] Dec 11 05:31:05 crc kubenswrapper[4628]: I1211 05:31:05.980440 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" podUID="69514995-7ac6-4dea-b519-317e80b5f9fd" containerName="dnsmasq-dns" containerID="cri-o://a59d3a81aff1260bc150430f51ef458bdd872ed471d26b29da6265800c5ad75b" gracePeriod=10 Dec 11 05:31:05 crc kubenswrapper[4628]: I1211 05:31:05.982010 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.419618 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.583555 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-config\") pod \"69514995-7ac6-4dea-b519-317e80b5f9fd\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.583610 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxz29\" (UniqueName: \"kubernetes.io/projected/69514995-7ac6-4dea-b519-317e80b5f9fd-kube-api-access-gxz29\") pod \"69514995-7ac6-4dea-b519-317e80b5f9fd\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.583652 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-ovsdbserver-nb\") pod \"69514995-7ac6-4dea-b519-317e80b5f9fd\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.583687 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-ovsdbserver-sb\") pod \"69514995-7ac6-4dea-b519-317e80b5f9fd\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.583733 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-dns-svc\") pod \"69514995-7ac6-4dea-b519-317e80b5f9fd\" (UID: \"69514995-7ac6-4dea-b519-317e80b5f9fd\") " Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.589521 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69514995-7ac6-4dea-b519-317e80b5f9fd-kube-api-access-gxz29" (OuterVolumeSpecName: "kube-api-access-gxz29") pod "69514995-7ac6-4dea-b519-317e80b5f9fd" (UID: "69514995-7ac6-4dea-b519-317e80b5f9fd"). InnerVolumeSpecName "kube-api-access-gxz29". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.608929 4628 generic.go:334] "Generic (PLEG): container finished" podID="69514995-7ac6-4dea-b519-317e80b5f9fd" containerID="a59d3a81aff1260bc150430f51ef458bdd872ed471d26b29da6265800c5ad75b" exitCode=0 Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.608984 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" event={"ID":"69514995-7ac6-4dea-b519-317e80b5f9fd","Type":"ContainerDied","Data":"a59d3a81aff1260bc150430f51ef458bdd872ed471d26b29da6265800c5ad75b"} Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.609011 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" event={"ID":"69514995-7ac6-4dea-b519-317e80b5f9fd","Type":"ContainerDied","Data":"7acdd4c67e95f9d9d548437fb0fe038f35d4fae3ddb3224ac8b34bcfad19e922"} Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.609027 4628 scope.go:117] "RemoveContainer" containerID="a59d3a81aff1260bc150430f51ef458bdd872ed471d26b29da6265800c5ad75b" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.609164 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-j4d2g" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.636889 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "69514995-7ac6-4dea-b519-317e80b5f9fd" (UID: "69514995-7ac6-4dea-b519-317e80b5f9fd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.637988 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-config" (OuterVolumeSpecName: "config") pod "69514995-7ac6-4dea-b519-317e80b5f9fd" (UID: "69514995-7ac6-4dea-b519-317e80b5f9fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.651307 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "69514995-7ac6-4dea-b519-317e80b5f9fd" (UID: "69514995-7ac6-4dea-b519-317e80b5f9fd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.659618 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "69514995-7ac6-4dea-b519-317e80b5f9fd" (UID: "69514995-7ac6-4dea-b519-317e80b5f9fd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.661710 4628 scope.go:117] "RemoveContainer" containerID="6026922b70aa95009023ff47b8bba02a48e5ead8ee279bc6882f2ae6cf6cc5e8" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.683384 4628 scope.go:117] "RemoveContainer" containerID="a59d3a81aff1260bc150430f51ef458bdd872ed471d26b29da6265800c5ad75b" Dec 11 05:31:06 crc kubenswrapper[4628]: E1211 05:31:06.684014 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a59d3a81aff1260bc150430f51ef458bdd872ed471d26b29da6265800c5ad75b\": container with ID starting with a59d3a81aff1260bc150430f51ef458bdd872ed471d26b29da6265800c5ad75b not found: ID does not exist" containerID="a59d3a81aff1260bc150430f51ef458bdd872ed471d26b29da6265800c5ad75b" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.684058 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a59d3a81aff1260bc150430f51ef458bdd872ed471d26b29da6265800c5ad75b"} err="failed to get container status \"a59d3a81aff1260bc150430f51ef458bdd872ed471d26b29da6265800c5ad75b\": rpc error: code = NotFound desc = could not find container \"a59d3a81aff1260bc150430f51ef458bdd872ed471d26b29da6265800c5ad75b\": container with ID starting with a59d3a81aff1260bc150430f51ef458bdd872ed471d26b29da6265800c5ad75b not found: ID does not exist" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.684107 4628 scope.go:117] "RemoveContainer" containerID="6026922b70aa95009023ff47b8bba02a48e5ead8ee279bc6882f2ae6cf6cc5e8" Dec 11 05:31:06 crc kubenswrapper[4628]: E1211 05:31:06.684407 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"6026922b70aa95009023ff47b8bba02a48e5ead8ee279bc6882f2ae6cf6cc5e8\": container with ID starting with 6026922b70aa95009023ff47b8bba02a48e5ead8ee279bc6882f2ae6cf6cc5e8 not found: ID does not exist" containerID="6026922b70aa95009023ff47b8bba02a48e5ead8ee279bc6882f2ae6cf6cc5e8" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.684447 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6026922b70aa95009023ff47b8bba02a48e5ead8ee279bc6882f2ae6cf6cc5e8"} err="failed to get container status \"6026922b70aa95009023ff47b8bba02a48e5ead8ee279bc6882f2ae6cf6cc5e8\": rpc error: code = NotFound desc = could not find container \"6026922b70aa95009023ff47b8bba02a48e5ead8ee279bc6882f2ae6cf6cc5e8\": container with ID starting with 6026922b70aa95009023ff47b8bba02a48e5ead8ee279bc6882f2ae6cf6cc5e8 not found: ID does not exist" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.685017 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.685040 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxz29\" (UniqueName: \"kubernetes.io/projected/69514995-7ac6-4dea-b519-317e80b5f9fd-kube-api-access-gxz29\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.685051 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.685062 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.685070 4628 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69514995-7ac6-4dea-b519-317e80b5f9fd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.939807 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-j4d2g"] Dec 11 05:31:06 crc kubenswrapper[4628]: I1211 05:31:06.951937 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-j4d2g"] Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.145445 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-ksb9p"] Dec 11 05:31:07 crc kubenswrapper[4628]: E1211 05:31:07.145737 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa3d172-61ed-43ef-b6c6-cddc2c80565b" containerName="init" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.145751 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa3d172-61ed-43ef-b6c6-cddc2c80565b" containerName="init" Dec 11 05:31:07 crc kubenswrapper[4628]: E1211 05:31:07.145764 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72259b41-2e95-4531-a2ff-2939e437253c" containerName="dnsmasq-dns" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.145771 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="72259b41-2e95-4531-a2ff-2939e437253c" containerName="dnsmasq-dns" Dec 11 05:31:07 crc kubenswrapper[4628]: E1211 05:31:07.145779 4628 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f1ede4d9-f3c1-4e13-948d-abd83adb1397" containerName="mariadb-account-create-update" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.145786 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ede4d9-f3c1-4e13-948d-abd83adb1397" containerName="mariadb-account-create-update" Dec 11 05:31:07 crc kubenswrapper[4628]: E1211 05:31:07.145800 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69514995-7ac6-4dea-b519-317e80b5f9fd" containerName="init" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.145805 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="69514995-7ac6-4dea-b519-317e80b5f9fd" containerName="init" Dec 11 05:31:07 crc kubenswrapper[4628]: E1211 05:31:07.145815 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="300bbc9f-12f3-4da2-bb3f-85a458b574cf" containerName="mariadb-database-create" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.145820 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="300bbc9f-12f3-4da2-bb3f-85a458b574cf" containerName="mariadb-database-create" Dec 11 05:31:07 crc kubenswrapper[4628]: E1211 05:31:07.145830 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1318bf7e-ab46-425f-b121-4423d0623af6" containerName="mariadb-database-create" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.145836 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="1318bf7e-ab46-425f-b121-4423d0623af6" containerName="mariadb-database-create" Dec 11 05:31:07 crc kubenswrapper[4628]: E1211 05:31:07.145862 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08120cb0-d26e-4520-a972-829bde3491dc" containerName="mariadb-account-create-update" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.145869 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="08120cb0-d26e-4520-a972-829bde3491dc" containerName="mariadb-account-create-update" Dec 11 05:31:07 crc kubenswrapper[4628]: E1211 05:31:07.145877 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69514995-7ac6-4dea-b519-317e80b5f9fd" containerName="dnsmasq-dns" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.145884 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="69514995-7ac6-4dea-b519-317e80b5f9fd" containerName="dnsmasq-dns" Dec 11 05:31:07 crc kubenswrapper[4628]: E1211 05:31:07.145894 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f" containerName="mariadb-database-create" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.145900 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f" containerName="mariadb-database-create" Dec 11 05:31:07 crc kubenswrapper[4628]: E1211 05:31:07.145911 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72259b41-2e95-4531-a2ff-2939e437253c" containerName="init" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.145917 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="72259b41-2e95-4531-a2ff-2939e437253c" containerName="init" Dec 11 05:31:07 crc kubenswrapper[4628]: E1211 05:31:07.145927 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a41b81d7-fa6c-4daa-b06d-df1105c0e566" containerName="mariadb-account-create-update" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.145933 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="a41b81d7-fa6c-4daa-b06d-df1105c0e566" 
containerName="mariadb-account-create-update" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.146078 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f" containerName="mariadb-database-create" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.146089 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="1318bf7e-ab46-425f-b121-4423d0623af6" containerName="mariadb-database-create" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.146097 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="300bbc9f-12f3-4da2-bb3f-85a458b574cf" containerName="mariadb-database-create" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.146104 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="72259b41-2e95-4531-a2ff-2939e437253c" containerName="dnsmasq-dns" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.146111 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="69514995-7ac6-4dea-b519-317e80b5f9fd" containerName="dnsmasq-dns" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.146122 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ede4d9-f3c1-4e13-948d-abd83adb1397" containerName="mariadb-account-create-update" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.146132 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="08120cb0-d26e-4520-a972-829bde3491dc" containerName="mariadb-account-create-update" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.146143 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="a41b81d7-fa6c-4daa-b06d-df1105c0e566" containerName="mariadb-account-create-update" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.146153 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aa3d172-61ed-43ef-b6c6-cddc2c80565b" containerName="init" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.146647 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ksb9p" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.149269 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-92zmr" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.156681 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.163635 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ksb9p"] Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.294080 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98c407a5-95e8-4036-becd-3522286435d5-config-data\") pod \"glance-db-sync-ksb9p\" (UID: \"98c407a5-95e8-4036-becd-3522286435d5\") " pod="openstack/glance-db-sync-ksb9p" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.294204 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fjdz\" (UniqueName: \"kubernetes.io/projected/98c407a5-95e8-4036-becd-3522286435d5-kube-api-access-2fjdz\") pod \"glance-db-sync-ksb9p\" (UID: \"98c407a5-95e8-4036-becd-3522286435d5\") " pod="openstack/glance-db-sync-ksb9p" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.294294 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98c407a5-95e8-4036-becd-3522286435d5-db-sync-config-data\") pod \"glance-db-sync-ksb9p\" (UID: \"98c407a5-95e8-4036-becd-3522286435d5\") " pod="openstack/glance-db-sync-ksb9p" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.294326 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c407a5-95e8-4036-becd-3522286435d5-combined-ca-bundle\") pod \"glance-db-sync-ksb9p\" (UID: \"98c407a5-95e8-4036-becd-3522286435d5\") " pod="openstack/glance-db-sync-ksb9p" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.395588 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98c407a5-95e8-4036-becd-3522286435d5-db-sync-config-data\") pod \"glance-db-sync-ksb9p\" (UID: \"98c407a5-95e8-4036-becd-3522286435d5\") " pod="openstack/glance-db-sync-ksb9p" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.395830 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c407a5-95e8-4036-becd-3522286435d5-combined-ca-bundle\") pod \"glance-db-sync-ksb9p\" (UID: \"98c407a5-95e8-4036-becd-3522286435d5\") " pod="openstack/glance-db-sync-ksb9p" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.395978 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98c407a5-95e8-4036-becd-3522286435d5-config-data\") pod \"glance-db-sync-ksb9p\" (UID: \"98c407a5-95e8-4036-becd-3522286435d5\") " pod="openstack/glance-db-sync-ksb9p" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.396106 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fjdz\" (UniqueName: \"kubernetes.io/projected/98c407a5-95e8-4036-becd-3522286435d5-kube-api-access-2fjdz\") pod 
\"glance-db-sync-ksb9p\" (UID: \"98c407a5-95e8-4036-becd-3522286435d5\") " pod="openstack/glance-db-sync-ksb9p" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.399576 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98c407a5-95e8-4036-becd-3522286435d5-db-sync-config-data\") pod \"glance-db-sync-ksb9p\" (UID: \"98c407a5-95e8-4036-becd-3522286435d5\") " pod="openstack/glance-db-sync-ksb9p" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.400143 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c407a5-95e8-4036-becd-3522286435d5-combined-ca-bundle\") pod \"glance-db-sync-ksb9p\" (UID: \"98c407a5-95e8-4036-becd-3522286435d5\") " pod="openstack/glance-db-sync-ksb9p" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.400580 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98c407a5-95e8-4036-becd-3522286435d5-config-data\") pod \"glance-db-sync-ksb9p\" (UID: \"98c407a5-95e8-4036-becd-3522286435d5\") " pod="openstack/glance-db-sync-ksb9p" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.419265 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fjdz\" (UniqueName: \"kubernetes.io/projected/98c407a5-95e8-4036-becd-3522286435d5-kube-api-access-2fjdz\") pod \"glance-db-sync-ksb9p\" (UID: \"98c407a5-95e8-4036-becd-3522286435d5\") " pod="openstack/glance-db-sync-ksb9p" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.464001 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ksb9p" Dec 11 05:31:07 crc kubenswrapper[4628]: I1211 05:31:07.899266 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69514995-7ac6-4dea-b519-317e80b5f9fd" path="/var/lib/kubelet/pods/69514995-7ac6-4dea-b519-317e80b5f9fd/volumes" Dec 11 05:31:08 crc kubenswrapper[4628]: I1211 05:31:08.011371 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ksb9p"] Dec 11 05:31:08 crc kubenswrapper[4628]: W1211 05:31:08.020951 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98c407a5_95e8_4036_becd_3522286435d5.slice/crio-b1a91a3e8c4f23beca2e685d8f0c55b214db6311f0fd290742cb8867f6d2e615 WatchSource:0}: Error finding container b1a91a3e8c4f23beca2e685d8f0c55b214db6311f0fd290742cb8867f6d2e615: Status 404 returned error can't find the container with id b1a91a3e8c4f23beca2e685d8f0c55b214db6311f0fd290742cb8867f6d2e615 Dec 11 05:31:08 crc kubenswrapper[4628]: I1211 05:31:08.633635 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ksb9p" event={"ID":"98c407a5-95e8-4036-becd-3522286435d5","Type":"ContainerStarted","Data":"b1a91a3e8c4f23beca2e685d8f0c55b214db6311f0fd290742cb8867f6d2e615"} Dec 11 05:31:10 crc kubenswrapper[4628]: I1211 05:31:10.663737 4628 generic.go:334] "Generic (PLEG): container finished" podID="5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1" containerID="b9e9128e46cd2edb45f5e538a69d47cd5e0ccc920eca3d955343c2f26ae59cab" exitCode=0 Dec 11 05:31:10 crc kubenswrapper[4628]: I1211 05:31:10.663830 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-t4rhc" 
event={"ID":"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1","Type":"ContainerDied","Data":"b9e9128e46cd2edb45f5e538a69d47cd5e0ccc920eca3d955343c2f26ae59cab"} Dec 11 05:31:11 crc kubenswrapper[4628]: I1211 05:31:11.677382 4628 generic.go:334] "Generic (PLEG): container finished" podID="5279d32c-7625-460c-881b-243e69077070" containerID="86cdb42df246a58a1bcb275c5570adfa9b9a943b1d21a98085ada9bb6063ed40" exitCode=0 Dec 11 05:31:11 crc kubenswrapper[4628]: I1211 05:31:11.677490 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5279d32c-7625-460c-881b-243e69077070","Type":"ContainerDied","Data":"86cdb42df246a58a1bcb275c5570adfa9b9a943b1d21a98085ada9bb6063ed40"} Dec 11 05:31:11 crc kubenswrapper[4628]: I1211 05:31:11.680089 4628 generic.go:334] "Generic (PLEG): container finished" podID="a07218df-1f25-47c4-89dc-2c7ce7f406ac" containerID="fbb2f6ff2b4cf940c6af7bddcc8de8efd9bce8d4d8220bccb23d4c9966e7b818" exitCode=0 Dec 11 05:31:11 crc kubenswrapper[4628]: I1211 05:31:11.680288 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a07218df-1f25-47c4-89dc-2c7ce7f406ac","Type":"ContainerDied","Data":"fbb2f6ff2b4cf940c6af7bddcc8de8efd9bce8d4d8220bccb23d4c9966e7b818"} Dec 11 05:31:11 crc kubenswrapper[4628]: I1211 05:31:11.987232 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.110242 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.121815 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-dispersionconf\") pod \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.121922 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-ring-data-devices\") pod \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.122041 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-swiftconf\") pod \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.122080 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-combined-ca-bundle\") pod \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.122147 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjlvh\" (UniqueName: \"kubernetes.io/projected/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-kube-api-access-bjlvh\") pod \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.122181 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-etc-swift\") pod \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.122195 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-scripts\") pod \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\" (UID: \"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1\") " Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.123238 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1" (UID: "5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.132016 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-kube-api-access-bjlvh" (OuterVolumeSpecName: "kube-api-access-bjlvh") pod "5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1" (UID: "5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1"). InnerVolumeSpecName "kube-api-access-bjlvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.133306 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1" (UID: "5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.136384 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1" (UID: "5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.184010 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1" (UID: "5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.191096 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1" (UID: "5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.207524 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-scripts" (OuterVolumeSpecName: "scripts") pod "5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1" (UID: "5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.223777 4628 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.223810 4628 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.223820 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.223829 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjlvh\" (UniqueName: \"kubernetes.io/projected/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-kube-api-access-bjlvh\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.223840 4628 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.223888 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.223898 4628 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.689890 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5279d32c-7625-460c-881b-243e69077070","Type":"ContainerStarted","Data":"866489e3cb37534a9f3a2f24ee29b00df3ac27c7ba0aba4f75e188125089cd72"} Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.690095 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.692175 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-t4rhc" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.692166 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-t4rhc" event={"ID":"5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1","Type":"ContainerDied","Data":"62afa1516216500faffb04826e3605338be9104e3f863e13454e4b4f5bd17168"} Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.692329 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62afa1516216500faffb04826e3605338be9104e3f863e13454e4b4f5bd17168" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.704156 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a07218df-1f25-47c4-89dc-2c7ce7f406ac","Type":"ContainerStarted","Data":"18ac8dd9c99eb617ae753c989348e0de63c5c48434de8d8b5cdd717035157858"} Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.704418 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.732211 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-etc-swift\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.738253 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/612f2afd-9958-4367-a8c0-13066a05cd11-etc-swift\") pod \"swift-storage-0\" (UID: \"612f2afd-9958-4367-a8c0-13066a05cd11\") " pod="openstack/swift-storage-0" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.747596 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371970.107218 podStartE2EDuration="1m6.747558647s" podCreationTimestamp="2025-12-11 05:30:06 +0000 UTC" firstStartedPulling="2025-12-11 05:30:08.903917066 +0000 UTC m=+911.321263764" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:31:12.734156617 +0000 UTC m=+975.151503315" watchObservedRunningTime="2025-12-11 05:31:12.747558647 +0000 UTC m=+975.164905345" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.833611 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.254498781 podStartE2EDuration="1m5.83358654s" podCreationTimestamp="2025-12-11 05:30:07 +0000 UTC" firstStartedPulling="2025-12-11 05:30:09.246592524 +0000 UTC m=+911.663939222" lastFinishedPulling="2025-12-11 05:30:37.825680283 +0000 UTC m=+940.243026981" observedRunningTime="2025-12-11 05:31:12.818391011 +0000 UTC m=+975.235737719" watchObservedRunningTime="2025-12-11 05:31:12.83358654 +0000 UTC m=+975.250933238" Dec 11 05:31:12 crc kubenswrapper[4628]: I1211 05:31:12.918232 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 11 05:31:13 crc kubenswrapper[4628]: I1211 05:31:13.685331 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 11 05:31:13 crc kubenswrapper[4628]: W1211 05:31:13.705415 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod612f2afd_9958_4367_a8c0_13066a05cd11.slice/crio-b5f272be58c86b87885e762711f944ba0d113deef426df7e1f79f6db82187c6e WatchSource:0}: Error finding container b5f272be58c86b87885e762711f944ba0d113deef426df7e1f79f6db82187c6e: Status 404 returned error can't find the container with id b5f272be58c86b87885e762711f944ba0d113deef426df7e1f79f6db82187c6e Dec 11 05:31:14 crc kubenswrapper[4628]: I1211 05:31:14.722822 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612f2afd-9958-4367-a8c0-13066a05cd11","Type":"ContainerStarted","Data":"b5f272be58c86b87885e762711f944ba0d113deef426df7e1f79f6db82187c6e"} Dec 11 05:31:17 crc kubenswrapper[4628]: I1211 05:31:17.925591 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:31:17 crc kubenswrapper[4628]: I1211 05:31:17.982428 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-qz7fr" podUID="d0885fe4-936a-4a13-b4e5-4aeee593c242" containerName="ovn-controller" probeResult="failure" output=< Dec 11 05:31:17 crc kubenswrapper[4628]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 11 05:31:17 crc kubenswrapper[4628]: > Dec 11 05:31:17 crc kubenswrapper[4628]: I1211 05:31:17.992433 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gncbg" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.201451 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qz7fr-config-4dgdw"] Dec 11 05:31:18 crc kubenswrapper[4628]: E1211 05:31:18.202054 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1" containerName="swift-ring-rebalance" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.202067 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1" containerName="swift-ring-rebalance" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.202226 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1" containerName="swift-ring-rebalance" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.202705 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.206886 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.226981 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qz7fr-config-4dgdw"] Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.262436 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b58a681-b228-4f2b-aa99-ea7777c4332c-var-run\") pod \"ovn-controller-qz7fr-config-4dgdw\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.262490 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b58a681-b228-4f2b-aa99-ea7777c4332c-additional-scripts\") pod \"ovn-controller-qz7fr-config-4dgdw\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.262529 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnqdb\" (UniqueName: \"kubernetes.io/projected/8b58a681-b228-4f2b-aa99-ea7777c4332c-kube-api-access-rnqdb\") pod \"ovn-controller-qz7fr-config-4dgdw\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.262582 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b58a681-b228-4f2b-aa99-ea7777c4332c-var-log-ovn\") pod \"ovn-controller-qz7fr-config-4dgdw\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.262604 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b58a681-b228-4f2b-aa99-ea7777c4332c-var-run-ovn\") pod \"ovn-controller-qz7fr-config-4dgdw\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.262628 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b58a681-b228-4f2b-aa99-ea7777c4332c-scripts\") pod \"ovn-controller-qz7fr-config-4dgdw\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.363808 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b58a681-b228-4f2b-aa99-ea7777c4332c-var-run-ovn\") pod \"ovn-controller-qz7fr-config-4dgdw\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.363873 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b58a681-b228-4f2b-aa99-ea7777c4332c-scripts\") pod 
\"ovn-controller-qz7fr-config-4dgdw\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.363911 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b58a681-b228-4f2b-aa99-ea7777c4332c-var-run\") pod \"ovn-controller-qz7fr-config-4dgdw\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.363953 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b58a681-b228-4f2b-aa99-ea7777c4332c-additional-scripts\") pod \"ovn-controller-qz7fr-config-4dgdw\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.363989 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnqdb\" (UniqueName: \"kubernetes.io/projected/8b58a681-b228-4f2b-aa99-ea7777c4332c-kube-api-access-rnqdb\") pod \"ovn-controller-qz7fr-config-4dgdw\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.364057 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b58a681-b228-4f2b-aa99-ea7777c4332c-var-log-ovn\") pod \"ovn-controller-qz7fr-config-4dgdw\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.364191 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b58a681-b228-4f2b-aa99-ea7777c4332c-var-run-ovn\") pod \"ovn-controller-qz7fr-config-4dgdw\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.364271 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b58a681-b228-4f2b-aa99-ea7777c4332c-var-log-ovn\") pod \"ovn-controller-qz7fr-config-4dgdw\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.364204 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b58a681-b228-4f2b-aa99-ea7777c4332c-var-run\") pod \"ovn-controller-qz7fr-config-4dgdw\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.364971 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b58a681-b228-4f2b-aa99-ea7777c4332c-additional-scripts\") pod \"ovn-controller-qz7fr-config-4dgdw\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.366010 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b58a681-b228-4f2b-aa99-ea7777c4332c-scripts\") pod 
\"ovn-controller-qz7fr-config-4dgdw\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.430519 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnqdb\" (UniqueName: \"kubernetes.io/projected/8b58a681-b228-4f2b-aa99-ea7777c4332c-kube-api-access-rnqdb\") pod \"ovn-controller-qz7fr-config-4dgdw\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:18 crc kubenswrapper[4628]: I1211 05:31:18.547174 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:22 crc kubenswrapper[4628]: I1211 05:31:22.918029 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-qz7fr" podUID="d0885fe4-936a-4a13-b4e5-4aeee593c242" containerName="ovn-controller" probeResult="failure" output=< Dec 11 05:31:22 crc kubenswrapper[4628]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 11 05:31:22 crc kubenswrapper[4628]: > Dec 11 05:31:24 crc kubenswrapper[4628]: E1211 05:31:24.269681 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Dec 11 05:31:24 crc kubenswrapper[4628]: E1211 05:31:24.270213 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2fjdz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPol
icy:nil,} start failed in pod glance-db-sync-ksb9p_openstack(98c407a5-95e8-4036-becd-3522286435d5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:31:24 crc kubenswrapper[4628]: E1211 05:31:24.271960 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-ksb9p" podUID="98c407a5-95e8-4036-becd-3522286435d5" Dec 11 05:31:24 crc kubenswrapper[4628]: I1211 05:31:24.550068 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qz7fr-config-4dgdw"] Dec 11 05:31:24 crc kubenswrapper[4628]: W1211 05:31:24.564114 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b58a681_b228_4f2b_aa99_ea7777c4332c.slice/crio-dca66237a021f54d8d260e2b5fbdaa9e8b4d19fe6238b3096c740b774098e56b WatchSource:0}: Error finding container dca66237a021f54d8d260e2b5fbdaa9e8b4d19fe6238b3096c740b774098e56b: Status 404 returned error can't find the container with id dca66237a021f54d8d260e2b5fbdaa9e8b4d19fe6238b3096c740b774098e56b Dec 11 05:31:24 crc kubenswrapper[4628]: I1211 05:31:24.805809 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qz7fr-config-4dgdw" event={"ID":"8b58a681-b228-4f2b-aa99-ea7777c4332c","Type":"ContainerStarted","Data":"dca66237a021f54d8d260e2b5fbdaa9e8b4d19fe6238b3096c740b774098e56b"} Dec 11 05:31:24 crc kubenswrapper[4628]: E1211 05:31:24.838061 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-ksb9p" podUID="98c407a5-95e8-4036-becd-3522286435d5" Dec 11 05:31:25 crc kubenswrapper[4628]: I1211 05:31:25.815325 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612f2afd-9958-4367-a8c0-13066a05cd11","Type":"ContainerStarted","Data":"04f8c05e7b02f141ab6e0da3e860b3db181b8afe8537d02d6e2e88e0fc2e5189"} Dec 11 05:31:25 crc kubenswrapper[4628]: I1211 05:31:25.815681 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612f2afd-9958-4367-a8c0-13066a05cd11","Type":"ContainerStarted","Data":"080d618fdda41e10c45d5cef07f7952114612ba262601940aaa6d97626efef15"} Dec 11 05:31:25 crc kubenswrapper[4628]: I1211 05:31:25.815696 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612f2afd-9958-4367-a8c0-13066a05cd11","Type":"ContainerStarted","Data":"28c5fc80197ee2282fdd8a8082053dccfe7b50e772a6d35c0cd225755f49508e"} Dec 11 05:31:25 crc kubenswrapper[4628]: I1211 05:31:25.815727 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612f2afd-9958-4367-a8c0-13066a05cd11","Type":"ContainerStarted","Data":"681f9417e3e6199ab57492163c0830b67b79c36fd4484f540f151c8358d462ca"} Dec 11 05:31:25 crc kubenswrapper[4628]: I1211 05:31:25.817226 4628 generic.go:334] "Generic (PLEG): container finished" podID="8b58a681-b228-4f2b-aa99-ea7777c4332c" containerID="ecd73a76204a4451e2d66df534ea5eab739cbcf64c836b3f7c403b7b4249d88a" exitCode=0 Dec 11 05:31:25 crc kubenswrapper[4628]: I1211 05:31:25.817258 4628 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ovn-controller-qz7fr-config-4dgdw" event={"ID":"8b58a681-b228-4f2b-aa99-ea7777c4332c","Type":"ContainerDied","Data":"ecd73a76204a4451e2d66df534ea5eab739cbcf64c836b3f7c403b7b4249d88a"} Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.226440 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.420793 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b58a681-b228-4f2b-aa99-ea7777c4332c-var-log-ovn\") pod \"8b58a681-b228-4f2b-aa99-ea7777c4332c\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.420984 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnqdb\" (UniqueName: \"kubernetes.io/projected/8b58a681-b228-4f2b-aa99-ea7777c4332c-kube-api-access-rnqdb\") pod \"8b58a681-b228-4f2b-aa99-ea7777c4332c\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.421031 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b58a681-b228-4f2b-aa99-ea7777c4332c-additional-scripts\") pod \"8b58a681-b228-4f2b-aa99-ea7777c4332c\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.421089 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b58a681-b228-4f2b-aa99-ea7777c4332c-scripts\") pod \"8b58a681-b228-4f2b-aa99-ea7777c4332c\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.421134 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b58a681-b228-4f2b-aa99-ea7777c4332c-var-run\") pod \"8b58a681-b228-4f2b-aa99-ea7777c4332c\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.421208 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b58a681-b228-4f2b-aa99-ea7777c4332c-var-run-ovn\") pod \"8b58a681-b228-4f2b-aa99-ea7777c4332c\" (UID: \"8b58a681-b228-4f2b-aa99-ea7777c4332c\") " Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.421410 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b58a681-b228-4f2b-aa99-ea7777c4332c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8b58a681-b228-4f2b-aa99-ea7777c4332c" (UID: "8b58a681-b228-4f2b-aa99-ea7777c4332c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.421468 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b58a681-b228-4f2b-aa99-ea7777c4332c-var-run" (OuterVolumeSpecName: "var-run") pod "8b58a681-b228-4f2b-aa99-ea7777c4332c" (UID: "8b58a681-b228-4f2b-aa99-ea7777c4332c"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.421605 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b58a681-b228-4f2b-aa99-ea7777c4332c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8b58a681-b228-4f2b-aa99-ea7777c4332c" (UID: "8b58a681-b228-4f2b-aa99-ea7777c4332c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.422349 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b58a681-b228-4f2b-aa99-ea7777c4332c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8b58a681-b228-4f2b-aa99-ea7777c4332c" (UID: "8b58a681-b228-4f2b-aa99-ea7777c4332c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.422976 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b58a681-b228-4f2b-aa99-ea7777c4332c-scripts" (OuterVolumeSpecName: "scripts") pod "8b58a681-b228-4f2b-aa99-ea7777c4332c" (UID: "8b58a681-b228-4f2b-aa99-ea7777c4332c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.429922 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b58a681-b228-4f2b-aa99-ea7777c4332c-kube-api-access-rnqdb" (OuterVolumeSpecName: "kube-api-access-rnqdb") pod "8b58a681-b228-4f2b-aa99-ea7777c4332c" (UID: "8b58a681-b228-4f2b-aa99-ea7777c4332c"). InnerVolumeSpecName "kube-api-access-rnqdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.523887 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnqdb\" (UniqueName: \"kubernetes.io/projected/8b58a681-b228-4f2b-aa99-ea7777c4332c-kube-api-access-rnqdb\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.523962 4628 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b58a681-b228-4f2b-aa99-ea7777c4332c-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.523991 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b58a681-b228-4f2b-aa99-ea7777c4332c-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.524029 4628 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b58a681-b228-4f2b-aa99-ea7777c4332c-var-run\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.524054 4628 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b58a681-b228-4f2b-aa99-ea7777c4332c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.524076 4628 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b58a681-b228-4f2b-aa99-ea7777c4332c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.835161 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-qz7fr-config-4dgdw" event={"ID":"8b58a681-b228-4f2b-aa99-ea7777c4332c","Type":"ContainerDied","Data":"dca66237a021f54d8d260e2b5fbdaa9e8b4d19fe6238b3096c740b774098e56b"} Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.835198 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dca66237a021f54d8d260e2b5fbdaa9e8b4d19fe6238b3096c740b774098e56b" Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.835249 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qz7fr-config-4dgdw" Dec 11 05:31:27 crc kubenswrapper[4628]: I1211 05:31:27.921034 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-qz7fr" Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.314982 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.331703 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-qz7fr-config-4dgdw"] Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.340806 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-qz7fr-config-4dgdw"] Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.678672 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-2pxdl"] Dec 11 05:31:28 crc kubenswrapper[4628]: E1211 05:31:28.679314 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b58a681-b228-4f2b-aa99-ea7777c4332c" containerName="ovn-config" Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.679332 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b58a681-b228-4f2b-aa99-ea7777c4332c" containerName="ovn-config" Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.679472 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b58a681-b228-4f2b-aa99-ea7777c4332c" containerName="ovn-config" Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.680044 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-2pxdl" Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.699035 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2pxdl"] Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.720062 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.852939 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldtxr\" (UniqueName: \"kubernetes.io/projected/2df1b74b-f6be-41c4-b9f1-6916553ef1d9-kube-api-access-ldtxr\") pod \"barbican-db-create-2pxdl\" (UID: \"2df1b74b-f6be-41c4-b9f1-6916553ef1d9\") " pod="openstack/barbican-db-create-2pxdl" Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.853092 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2df1b74b-f6be-41c4-b9f1-6916553ef1d9-operator-scripts\") pod \"barbican-db-create-2pxdl\" (UID: \"2df1b74b-f6be-41c4-b9f1-6916553ef1d9\") " pod="openstack/barbican-db-create-2pxdl" Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.858660 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612f2afd-9958-4367-a8c0-13066a05cd11","Type":"ContainerStarted","Data":"680e7f6fbec470ee87308963c86321fba2fb33e7304544be583e177330ccd574"} Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.858703 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612f2afd-9958-4367-a8c0-13066a05cd11","Type":"ContainerStarted","Data":"4e6b9f5ccac532a98c52f6aba683bc4f925f0598c50f8bb8cb56c1ac3b9743b5"} Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.891861 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-f632-account-create-update-cvcfq"] Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.895172 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-f632-account-create-update-cvcfq" Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.899011 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.927362 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-f632-account-create-update-cvcfq"] Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.954218 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cc9b\" (UniqueName: \"kubernetes.io/projected/c13e137b-1a8a-4965-8f85-04ad2b5ff488-kube-api-access-4cc9b\") pod \"barbican-f632-account-create-update-cvcfq\" (UID: \"c13e137b-1a8a-4965-8f85-04ad2b5ff488\") " pod="openstack/barbican-f632-account-create-update-cvcfq" Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.954267 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2df1b74b-f6be-41c4-b9f1-6916553ef1d9-operator-scripts\") pod \"barbican-db-create-2pxdl\" (UID: \"2df1b74b-f6be-41c4-b9f1-6916553ef1d9\") " pod="openstack/barbican-db-create-2pxdl" Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.954307 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13e137b-1a8a-4965-8f85-04ad2b5ff488-operator-scripts\") pod \"barbican-f632-account-create-update-cvcfq\" (UID: \"c13e137b-1a8a-4965-8f85-04ad2b5ff488\") " pod="openstack/barbican-f632-account-create-update-cvcfq" Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.954512 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldtxr\" (UniqueName: \"kubernetes.io/projected/2df1b74b-f6be-41c4-b9f1-6916553ef1d9-kube-api-access-ldtxr\") pod \"barbican-db-create-2pxdl\" (UID: \"2df1b74b-f6be-41c4-b9f1-6916553ef1d9\") " pod="openstack/barbican-db-create-2pxdl" Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.954912 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2df1b74b-f6be-41c4-b9f1-6916553ef1d9-operator-scripts\") pod \"barbican-db-create-2pxdl\" (UID: \"2df1b74b-f6be-41c4-b9f1-6916553ef1d9\") " pod="openstack/barbican-db-create-2pxdl" Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.990994 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-cl2fq"] Dec 11 05:31:28 crc kubenswrapper[4628]: I1211 05:31:28.992243 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cl2fq" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.001837 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldtxr\" (UniqueName: \"kubernetes.io/projected/2df1b74b-f6be-41c4-b9f1-6916553ef1d9-kube-api-access-ldtxr\") pod \"barbican-db-create-2pxdl\" (UID: \"2df1b74b-f6be-41c4-b9f1-6916553ef1d9\") " pod="openstack/barbican-db-create-2pxdl" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.007033 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-2pxdl" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.019412 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cl2fq"] Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.057360 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cc9b\" (UniqueName: \"kubernetes.io/projected/c13e137b-1a8a-4965-8f85-04ad2b5ff488-kube-api-access-4cc9b\") pod \"barbican-f632-account-create-update-cvcfq\" (UID: \"c13e137b-1a8a-4965-8f85-04ad2b5ff488\") " pod="openstack/barbican-f632-account-create-update-cvcfq" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.057423 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11763665-9dfb-4894-94f3-1729ee24848a-operator-scripts\") pod \"cinder-db-create-cl2fq\" (UID: \"11763665-9dfb-4894-94f3-1729ee24848a\") " pod="openstack/cinder-db-create-cl2fq" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.057475 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnncm\" (UniqueName: \"kubernetes.io/projected/11763665-9dfb-4894-94f3-1729ee24848a-kube-api-access-xnncm\") pod \"cinder-db-create-cl2fq\" (UID: \"11763665-9dfb-4894-94f3-1729ee24848a\") " pod="openstack/cinder-db-create-cl2fq" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.057521 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13e137b-1a8a-4965-8f85-04ad2b5ff488-operator-scripts\") pod \"barbican-f632-account-create-update-cvcfq\" (UID: \"c13e137b-1a8a-4965-8f85-04ad2b5ff488\") " pod="openstack/barbican-f632-account-create-update-cvcfq" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.058342 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13e137b-1a8a-4965-8f85-04ad2b5ff488-operator-scripts\") pod \"barbican-f632-account-create-update-cvcfq\" (UID: \"c13e137b-1a8a-4965-8f85-04ad2b5ff488\") " pod="openstack/barbican-f632-account-create-update-cvcfq" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.119813 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cc9b\" (UniqueName: \"kubernetes.io/projected/c13e137b-1a8a-4965-8f85-04ad2b5ff488-kube-api-access-4cc9b\") pod \"barbican-f632-account-create-update-cvcfq\" (UID: \"c13e137b-1a8a-4965-8f85-04ad2b5ff488\") " pod="openstack/barbican-f632-account-create-update-cvcfq" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.159721 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnncm\" (UniqueName: \"kubernetes.io/projected/11763665-9dfb-4894-94f3-1729ee24848a-kube-api-access-xnncm\") pod \"cinder-db-create-cl2fq\" (UID: \"11763665-9dfb-4894-94f3-1729ee24848a\") " pod="openstack/cinder-db-create-cl2fq" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.159884 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11763665-9dfb-4894-94f3-1729ee24848a-operator-scripts\") pod \"cinder-db-create-cl2fq\" (UID: \"11763665-9dfb-4894-94f3-1729ee24848a\") " pod="openstack/cinder-db-create-cl2fq" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.160572 4628 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11763665-9dfb-4894-94f3-1729ee24848a-operator-scripts\") pod \"cinder-db-create-cl2fq\" (UID: \"11763665-9dfb-4894-94f3-1729ee24848a\") " pod="openstack/cinder-db-create-cl2fq" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.193653 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnncm\" (UniqueName: \"kubernetes.io/projected/11763665-9dfb-4894-94f3-1729ee24848a-kube-api-access-xnncm\") pod \"cinder-db-create-cl2fq\" (UID: \"11763665-9dfb-4894-94f3-1729ee24848a\") " pod="openstack/cinder-db-create-cl2fq" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.211333 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f632-account-create-update-cvcfq" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.215930 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-bfc0-account-create-update-6lpng"] Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.216781 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bfc0-account-create-update-6lpng" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.219257 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.241061 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-p46mc"] Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.242045 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p46mc" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.249405 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-bfc0-account-create-update-6lpng"] Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.251559 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.251971 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.252158 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-58h9w" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.256679 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.263300 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-p46mc"] Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.386611 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm962\" (UniqueName: \"kubernetes.io/projected/1480b26d-86ec-4157-ae9d-d3333ccc2932-kube-api-access-sm962\") pod \"cinder-bfc0-account-create-update-6lpng\" (UID: \"1480b26d-86ec-4157-ae9d-d3333ccc2932\") " pod="openstack/cinder-bfc0-account-create-update-6lpng" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.386951 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6mvm\" (UniqueName: \"kubernetes.io/projected/4dcb4aef-66a4-452a-a29b-5d387373785e-kube-api-access-d6mvm\") pod \"keystone-db-sync-p46mc\" (UID: \"4dcb4aef-66a4-452a-a29b-5d387373785e\") " 
pod="openstack/keystone-db-sync-p46mc" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.387008 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1480b26d-86ec-4157-ae9d-d3333ccc2932-operator-scripts\") pod \"cinder-bfc0-account-create-update-6lpng\" (UID: \"1480b26d-86ec-4157-ae9d-d3333ccc2932\") " pod="openstack/cinder-bfc0-account-create-update-6lpng" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.387091 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dcb4aef-66a4-452a-a29b-5d387373785e-config-data\") pod \"keystone-db-sync-p46mc\" (UID: \"4dcb4aef-66a4-452a-a29b-5d387373785e\") " pod="openstack/keystone-db-sync-p46mc" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.387122 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dcb4aef-66a4-452a-a29b-5d387373785e-combined-ca-bundle\") pod \"keystone-db-sync-p46mc\" (UID: \"4dcb4aef-66a4-452a-a29b-5d387373785e\") " pod="openstack/keystone-db-sync-p46mc" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.406814 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cl2fq" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.524367 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm962\" (UniqueName: \"kubernetes.io/projected/1480b26d-86ec-4157-ae9d-d3333ccc2932-kube-api-access-sm962\") pod \"cinder-bfc0-account-create-update-6lpng\" (UID: \"1480b26d-86ec-4157-ae9d-d3333ccc2932\") " pod="openstack/cinder-bfc0-account-create-update-6lpng" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.524423 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6mvm\" (UniqueName: \"kubernetes.io/projected/4dcb4aef-66a4-452a-a29b-5d387373785e-kube-api-access-d6mvm\") pod \"keystone-db-sync-p46mc\" (UID: \"4dcb4aef-66a4-452a-a29b-5d387373785e\") " pod="openstack/keystone-db-sync-p46mc" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.524459 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1480b26d-86ec-4157-ae9d-d3333ccc2932-operator-scripts\") pod \"cinder-bfc0-account-create-update-6lpng\" (UID: \"1480b26d-86ec-4157-ae9d-d3333ccc2932\") " pod="openstack/cinder-bfc0-account-create-update-6lpng" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.524537 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dcb4aef-66a4-452a-a29b-5d387373785e-config-data\") pod \"keystone-db-sync-p46mc\" (UID: \"4dcb4aef-66a4-452a-a29b-5d387373785e\") " pod="openstack/keystone-db-sync-p46mc" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.524560 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dcb4aef-66a4-452a-a29b-5d387373785e-combined-ca-bundle\") pod \"keystone-db-sync-p46mc\" (UID: \"4dcb4aef-66a4-452a-a29b-5d387373785e\") " pod="openstack/keystone-db-sync-p46mc" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.525464 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1480b26d-86ec-4157-ae9d-d3333ccc2932-operator-scripts\") pod \"cinder-bfc0-account-create-update-6lpng\" (UID: \"1480b26d-86ec-4157-ae9d-d3333ccc2932\") " pod="openstack/cinder-bfc0-account-create-update-6lpng" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.530761 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dcb4aef-66a4-452a-a29b-5d387373785e-config-data\") pod \"keystone-db-sync-p46mc\" (UID: \"4dcb4aef-66a4-452a-a29b-5d387373785e\") " pod="openstack/keystone-db-sync-p46mc" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.533943 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-xhlf8"] Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.535295 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xhlf8" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.540564 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dcb4aef-66a4-452a-a29b-5d387373785e-combined-ca-bundle\") pod \"keystone-db-sync-p46mc\" (UID: \"4dcb4aef-66a4-452a-a29b-5d387373785e\") " pod="openstack/keystone-db-sync-p46mc" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.578586 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6mvm\" (UniqueName: \"kubernetes.io/projected/4dcb4aef-66a4-452a-a29b-5d387373785e-kube-api-access-d6mvm\") pod \"keystone-db-sync-p46mc\" (UID: \"4dcb4aef-66a4-452a-a29b-5d387373785e\") " pod="openstack/keystone-db-sync-p46mc" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.588923 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm962\" (UniqueName: \"kubernetes.io/projected/1480b26d-86ec-4157-ae9d-d3333ccc2932-kube-api-access-sm962\") pod \"cinder-bfc0-account-create-update-6lpng\" (UID: \"1480b26d-86ec-4157-ae9d-d3333ccc2932\") " pod="openstack/cinder-bfc0-account-create-update-6lpng" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.619252 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-01ba-account-create-update-wzhwm"] Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.621227 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-01ba-account-create-update-wzhwm" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.631179 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.632908 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xhlf8"] Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.641722 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-01ba-account-create-update-wzhwm"] Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.686808 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-p46mc" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.732131 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brf62\" (UniqueName: \"kubernetes.io/projected/b5167f66-289b-4976-b502-640a327fa7bc-kube-api-access-brf62\") pod \"neutron-01ba-account-create-update-wzhwm\" (UID: \"b5167f66-289b-4976-b502-640a327fa7bc\") " pod="openstack/neutron-01ba-account-create-update-wzhwm" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.732187 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b35a721-4483-4f15-a0a4-b516b96f9c76-operator-scripts\") pod \"neutron-db-create-xhlf8\" (UID: \"2b35a721-4483-4f15-a0a4-b516b96f9c76\") " pod="openstack/neutron-db-create-xhlf8" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.732229 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5167f66-289b-4976-b502-640a327fa7bc-operator-scripts\") pod \"neutron-01ba-account-create-update-wzhwm\" (UID: \"b5167f66-289b-4976-b502-640a327fa7bc\") " pod="openstack/neutron-01ba-account-create-update-wzhwm" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.732286 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78zrw\" (UniqueName: \"kubernetes.io/projected/2b35a721-4483-4f15-a0a4-b516b96f9c76-kube-api-access-78zrw\") pod \"neutron-db-create-xhlf8\" (UID: \"2b35a721-4483-4f15-a0a4-b516b96f9c76\") " pod="openstack/neutron-db-create-xhlf8" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.835275 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5167f66-289b-4976-b502-640a327fa7bc-operator-scripts\") pod \"neutron-01ba-account-create-update-wzhwm\" (UID: \"b5167f66-289b-4976-b502-640a327fa7bc\") " pod="openstack/neutron-01ba-account-create-update-wzhwm" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.835352 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78zrw\" (UniqueName: \"kubernetes.io/projected/2b35a721-4483-4f15-a0a4-b516b96f9c76-kube-api-access-78zrw\") pod \"neutron-db-create-xhlf8\" (UID: \"2b35a721-4483-4f15-a0a4-b516b96f9c76\") " pod="openstack/neutron-db-create-xhlf8" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.835410 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brf62\" (UniqueName: \"kubernetes.io/projected/b5167f66-289b-4976-b502-640a327fa7bc-kube-api-access-brf62\") pod \"neutron-01ba-account-create-update-wzhwm\" (UID: \"b5167f66-289b-4976-b502-640a327fa7bc\") " pod="openstack/neutron-01ba-account-create-update-wzhwm" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.835442 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b35a721-4483-4f15-a0a4-b516b96f9c76-operator-scripts\") pod \"neutron-db-create-xhlf8\" (UID: \"2b35a721-4483-4f15-a0a4-b516b96f9c76\") " pod="openstack/neutron-db-create-xhlf8" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.836157 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2b35a721-4483-4f15-a0a4-b516b96f9c76-operator-scripts\") pod \"neutron-db-create-xhlf8\" (UID: \"2b35a721-4483-4f15-a0a4-b516b96f9c76\") " pod="openstack/neutron-db-create-xhlf8" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.836522 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bfc0-account-create-update-6lpng" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.836834 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5167f66-289b-4976-b502-640a327fa7bc-operator-scripts\") pod \"neutron-01ba-account-create-update-wzhwm\" (UID: \"b5167f66-289b-4976-b502-640a327fa7bc\") " pod="openstack/neutron-01ba-account-create-update-wzhwm" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.894262 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brf62\" (UniqueName: \"kubernetes.io/projected/b5167f66-289b-4976-b502-640a327fa7bc-kube-api-access-brf62\") pod \"neutron-01ba-account-create-update-wzhwm\" (UID: \"b5167f66-289b-4976-b502-640a327fa7bc\") " pod="openstack/neutron-01ba-account-create-update-wzhwm" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.894785 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78zrw\" (UniqueName: \"kubernetes.io/projected/2b35a721-4483-4f15-a0a4-b516b96f9c76-kube-api-access-78zrw\") pod \"neutron-db-create-xhlf8\" (UID: \"2b35a721-4483-4f15-a0a4-b516b96f9c76\") " pod="openstack/neutron-db-create-xhlf8" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.923249 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b58a681-b228-4f2b-aa99-ea7777c4332c" path="/var/lib/kubelet/pods/8b58a681-b228-4f2b-aa99-ea7777c4332c/volumes" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.923883 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-f632-account-create-update-cvcfq"] Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.928581 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2pxdl"] Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.991361 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-01ba-account-create-update-wzhwm" Dec 11 05:31:29 crc kubenswrapper[4628]: I1211 05:31:29.993739 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-xhlf8" Dec 11 05:31:30 crc kubenswrapper[4628]: I1211 05:31:30.001117 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612f2afd-9958-4367-a8c0-13066a05cd11","Type":"ContainerStarted","Data":"c1a9912c155d76bbb1ce0915d614cf9cf9a53fabcc4fc524cc70b2456ff1c83b"} Dec 11 05:31:30 crc kubenswrapper[4628]: I1211 05:31:30.001164 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612f2afd-9958-4367-a8c0-13066a05cd11","Type":"ContainerStarted","Data":"09c40bcb6fe8244dbf6fdaba16e237d12fd5afe23d9f4267a0ee325ba8d34490"} Dec 11 05:31:30 crc kubenswrapper[4628]: I1211 05:31:30.269821 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cl2fq"] Dec 11 05:31:30 crc kubenswrapper[4628]: I1211 05:31:30.575258 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-p46mc"] Dec 11 05:31:30 crc kubenswrapper[4628]: I1211 05:31:30.798067 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e4498a18-7449-45b3-9061-d3ffbfa4be5b" containerName="galera" probeResult="failure" output="command timed out" Dec 11 05:31:30 crc kubenswrapper[4628]: I1211 05:31:30.824205 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-bfc0-account-create-update-6lpng"] Dec 11 05:31:31 crc kubenswrapper[4628]: I1211 05:31:31.015477 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2pxdl" event={"ID":"2df1b74b-f6be-41c4-b9f1-6916553ef1d9","Type":"ContainerStarted","Data":"76d0959098e92c7f8c31c62866c6a21ad529ac4fe2ec60f4e70927e4ca5f3805"} Dec 11 05:31:31 crc kubenswrapper[4628]: I1211 05:31:31.015520 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2pxdl" event={"ID":"2df1b74b-f6be-41c4-b9f1-6916553ef1d9","Type":"ContainerStarted","Data":"c16bbc70cd38e63fd1c42d2ec77e2b54c2d009efde225015c955c73ef2ab54cb"} Dec 11 05:31:31 crc kubenswrapper[4628]: I1211 05:31:31.026635 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cl2fq" event={"ID":"11763665-9dfb-4894-94f3-1729ee24848a","Type":"ContainerStarted","Data":"5d9e0857c1105bf0db5e0cf3f0e2d1741fde853e1a390d3fc1294fb4422b07df"} Dec 11 05:31:31 crc kubenswrapper[4628]: I1211 05:31:31.026694 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cl2fq" event={"ID":"11763665-9dfb-4894-94f3-1729ee24848a","Type":"ContainerStarted","Data":"a11dd99962d2d906a98583a3fb0db1eb051a224ce86ef2d0add10c317d8b14b2"} Dec 11 05:31:31 crc kubenswrapper[4628]: I1211 05:31:31.033905 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bfc0-account-create-update-6lpng" event={"ID":"1480b26d-86ec-4157-ae9d-d3333ccc2932","Type":"ContainerStarted","Data":"ff64720b725f9eaf64705f7c9e9e3a7fac3710f09a858edf1b6bcc1d24e3f836"} Dec 11 05:31:31 crc kubenswrapper[4628]: I1211 05:31:31.037185 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p46mc" event={"ID":"4dcb4aef-66a4-452a-a29b-5d387373785e","Type":"ContainerStarted","Data":"603c6938dcf9b98a0710f60fdb95b06e4911d7e55f8ca9dbb8a0af8ed9e78a77"} Dec 11 05:31:31 crc kubenswrapper[4628]: I1211 05:31:31.048304 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f632-account-create-update-cvcfq" 
event={"ID":"c13e137b-1a8a-4965-8f85-04ad2b5ff488","Type":"ContainerStarted","Data":"88689b8c1e3f9b317f9ab4b507b04d82e9452615ae03345e83d14d2ce1b1ad2e"} Dec 11 05:31:31 crc kubenswrapper[4628]: I1211 05:31:31.048341 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f632-account-create-update-cvcfq" event={"ID":"c13e137b-1a8a-4965-8f85-04ad2b5ff488","Type":"ContainerStarted","Data":"a492bf50bae81f06f2433b59649fb8a4ccf5a27af090357531857477d8188528"} Dec 11 05:31:31 crc kubenswrapper[4628]: I1211 05:31:31.050215 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-2pxdl" podStartSLOduration=3.050205482 podStartE2EDuration="3.050205482s" podCreationTimestamp="2025-12-11 05:31:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:31:31.034131948 +0000 UTC m=+993.451478646" watchObservedRunningTime="2025-12-11 05:31:31.050205482 +0000 UTC m=+993.467552180" Dec 11 05:31:31 crc kubenswrapper[4628]: I1211 05:31:31.054430 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-cl2fq" podStartSLOduration=3.054420527 podStartE2EDuration="3.054420527s" podCreationTimestamp="2025-12-11 05:31:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:31:31.050442368 +0000 UTC m=+993.467789066" watchObservedRunningTime="2025-12-11 05:31:31.054420527 +0000 UTC m=+993.471767225" Dec 11 05:31:31 crc kubenswrapper[4628]: I1211 05:31:31.078944 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-f632-account-create-update-cvcfq" podStartSLOduration=3.078913313 podStartE2EDuration="3.078913313s" podCreationTimestamp="2025-12-11 05:31:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:31:31.070415769 +0000 UTC m=+993.487762457" watchObservedRunningTime="2025-12-11 05:31:31.078913313 +0000 UTC m=+993.496260011" Dec 11 05:31:31 crc kubenswrapper[4628]: I1211 05:31:31.088617 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-01ba-account-create-update-wzhwm"] Dec 11 05:31:31 crc kubenswrapper[4628]: I1211 05:31:31.155016 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xhlf8"] Dec 11 05:31:32 crc kubenswrapper[4628]: I1211 05:31:32.057649 4628 generic.go:334] "Generic (PLEG): container finished" podID="11763665-9dfb-4894-94f3-1729ee24848a" containerID="5d9e0857c1105bf0db5e0cf3f0e2d1741fde853e1a390d3fc1294fb4422b07df" exitCode=0 Dec 11 05:31:32 crc kubenswrapper[4628]: I1211 05:31:32.057945 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cl2fq" event={"ID":"11763665-9dfb-4894-94f3-1729ee24848a","Type":"ContainerDied","Data":"5d9e0857c1105bf0db5e0cf3f0e2d1741fde853e1a390d3fc1294fb4422b07df"} Dec 11 05:31:32 crc kubenswrapper[4628]: I1211 05:31:32.059415 4628 generic.go:334] "Generic (PLEG): container finished" podID="2b35a721-4483-4f15-a0a4-b516b96f9c76" containerID="f81e3b712c601ac3815979e6f05e26f4b5a91a2be5ec45cc68324b2d709acbbb" exitCode=0 Dec 11 05:31:32 crc kubenswrapper[4628]: I1211 05:31:32.059455 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xhlf8" 
event={"ID":"2b35a721-4483-4f15-a0a4-b516b96f9c76","Type":"ContainerDied","Data":"f81e3b712c601ac3815979e6f05e26f4b5a91a2be5ec45cc68324b2d709acbbb"} Dec 11 05:31:32 crc kubenswrapper[4628]: I1211 05:31:32.059468 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xhlf8" event={"ID":"2b35a721-4483-4f15-a0a4-b516b96f9c76","Type":"ContainerStarted","Data":"d928694f2c99702509d7c53674c820e1c7e84b37f353ce85323e9007ff64a4e8"} Dec 11 05:31:32 crc kubenswrapper[4628]: I1211 05:31:32.062449 4628 generic.go:334] "Generic (PLEG): container finished" podID="1480b26d-86ec-4157-ae9d-d3333ccc2932" containerID="108dc06401e668ce1649c20dbf21c7fc5873e3119ee79d0dab68e6588f7f0bf3" exitCode=0 Dec 11 05:31:32 crc kubenswrapper[4628]: I1211 05:31:32.062483 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bfc0-account-create-update-6lpng" event={"ID":"1480b26d-86ec-4157-ae9d-d3333ccc2932","Type":"ContainerDied","Data":"108dc06401e668ce1649c20dbf21c7fc5873e3119ee79d0dab68e6588f7f0bf3"} Dec 11 05:31:32 crc kubenswrapper[4628]: I1211 05:31:32.063439 4628 generic.go:334] "Generic (PLEG): container finished" podID="c13e137b-1a8a-4965-8f85-04ad2b5ff488" containerID="88689b8c1e3f9b317f9ab4b507b04d82e9452615ae03345e83d14d2ce1b1ad2e" exitCode=0 Dec 11 05:31:32 crc kubenswrapper[4628]: I1211 05:31:32.063470 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f632-account-create-update-cvcfq" event={"ID":"c13e137b-1a8a-4965-8f85-04ad2b5ff488","Type":"ContainerDied","Data":"88689b8c1e3f9b317f9ab4b507b04d82e9452615ae03345e83d14d2ce1b1ad2e"} Dec 11 05:31:32 crc kubenswrapper[4628]: I1211 05:31:32.065120 4628 generic.go:334] "Generic (PLEG): container finished" podID="b5167f66-289b-4976-b502-640a327fa7bc" containerID="f88d95bf5233cd751b800a6bd4f135a25aa9dfd6ebd3dba962ce4cb44b5f561c" exitCode=0 Dec 11 05:31:32 crc kubenswrapper[4628]: I1211 05:31:32.065157 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-01ba-account-create-update-wzhwm" event={"ID":"b5167f66-289b-4976-b502-640a327fa7bc","Type":"ContainerDied","Data":"f88d95bf5233cd751b800a6bd4f135a25aa9dfd6ebd3dba962ce4cb44b5f561c"} Dec 11 05:31:32 crc kubenswrapper[4628]: I1211 05:31:32.065171 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-01ba-account-create-update-wzhwm" event={"ID":"b5167f66-289b-4976-b502-640a327fa7bc","Type":"ContainerStarted","Data":"da79dd415525cc6cefb7c349cdd8dbdbd65417e6705b58dc8218adcaa8d4685e"} Dec 11 05:31:32 crc kubenswrapper[4628]: I1211 05:31:32.066070 4628 generic.go:334] "Generic (PLEG): container finished" podID="2df1b74b-f6be-41c4-b9f1-6916553ef1d9" containerID="76d0959098e92c7f8c31c62866c6a21ad529ac4fe2ec60f4e70927e4ca5f3805" exitCode=0 Dec 11 05:31:32 crc kubenswrapper[4628]: I1211 05:31:32.066090 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2pxdl" event={"ID":"2df1b74b-f6be-41c4-b9f1-6916553ef1d9","Type":"ContainerDied","Data":"76d0959098e92c7f8c31c62866c6a21ad529ac4fe2ec60f4e70927e4ca5f3805"} Dec 11 05:31:32 crc kubenswrapper[4628]: E1211 05:31:32.555880 4628 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.18:42554->38.102.83.18:36143: read tcp 38.102.83.18:42554->38.102.83.18:36143: read: connection reset by peer Dec 11 05:31:33 crc kubenswrapper[4628]: I1211 05:31:33.089415 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"612f2afd-9958-4367-a8c0-13066a05cd11","Type":"ContainerStarted","Data":"6379948d7115284c5a4da5de1d5021abd7450d046abaae1030ba14b5fa55e3f7"} Dec 11 05:31:33 crc kubenswrapper[4628]: I1211 05:31:33.089764 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612f2afd-9958-4367-a8c0-13066a05cd11","Type":"ContainerStarted","Data":"d1d4c5d45c98ad1974664ec06dfb08f639c5cc2c861b249e12465bbbcae69fec"} Dec 11 05:31:33 crc kubenswrapper[4628]: I1211 05:31:33.089780 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612f2afd-9958-4367-a8c0-13066a05cd11","Type":"ContainerStarted","Data":"7c02da09fcca39fcde8522632cf923ce53432fb5f37e356ea56dd963e7ed0a6d"} Dec 11 05:31:33 crc kubenswrapper[4628]: I1211 05:31:33.089792 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612f2afd-9958-4367-a8c0-13066a05cd11","Type":"ContainerStarted","Data":"6eb12066a88f7859774fd6913bab4c681d6c75169c28f1bec87a81a1d7685396"} Dec 11 05:31:33 crc kubenswrapper[4628]: I1211 05:31:33.757856 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cl2fq" Dec 11 05:31:33 crc kubenswrapper[4628]: I1211 05:31:33.858513 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnncm\" (UniqueName: \"kubernetes.io/projected/11763665-9dfb-4894-94f3-1729ee24848a-kube-api-access-xnncm\") pod \"11763665-9dfb-4894-94f3-1729ee24848a\" (UID: \"11763665-9dfb-4894-94f3-1729ee24848a\") " Dec 11 05:31:33 crc kubenswrapper[4628]: I1211 05:31:33.865056 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11763665-9dfb-4894-94f3-1729ee24848a-kube-api-access-xnncm" (OuterVolumeSpecName: "kube-api-access-xnncm") pod "11763665-9dfb-4894-94f3-1729ee24848a" (UID: "11763665-9dfb-4894-94f3-1729ee24848a"). InnerVolumeSpecName "kube-api-access-xnncm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:31:33 crc kubenswrapper[4628]: I1211 05:31:33.960018 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11763665-9dfb-4894-94f3-1729ee24848a-operator-scripts\") pod \"11763665-9dfb-4894-94f3-1729ee24848a\" (UID: \"11763665-9dfb-4894-94f3-1729ee24848a\") " Dec 11 05:31:33 crc kubenswrapper[4628]: I1211 05:31:33.960505 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnncm\" (UniqueName: \"kubernetes.io/projected/11763665-9dfb-4894-94f3-1729ee24848a-kube-api-access-xnncm\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:33 crc kubenswrapper[4628]: I1211 05:31:33.960979 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11763665-9dfb-4894-94f3-1729ee24848a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11763665-9dfb-4894-94f3-1729ee24848a" (UID: "11763665-9dfb-4894-94f3-1729ee24848a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:33 crc kubenswrapper[4628]: I1211 05:31:33.991070 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2pxdl" Dec 11 05:31:33 crc kubenswrapper[4628]: I1211 05:31:33.995766 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-01ba-account-create-update-wzhwm" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.002385 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bfc0-account-create-update-6lpng" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.072392 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5167f66-289b-4976-b502-640a327fa7bc-operator-scripts\") pod \"b5167f66-289b-4976-b502-640a327fa7bc\" (UID: \"b5167f66-289b-4976-b502-640a327fa7bc\") " Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.072439 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2df1b74b-f6be-41c4-b9f1-6916553ef1d9-operator-scripts\") pod \"2df1b74b-f6be-41c4-b9f1-6916553ef1d9\" (UID: \"2df1b74b-f6be-41c4-b9f1-6916553ef1d9\") " Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.072474 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1480b26d-86ec-4157-ae9d-d3333ccc2932-operator-scripts\") pod \"1480b26d-86ec-4157-ae9d-d3333ccc2932\" (UID: \"1480b26d-86ec-4157-ae9d-d3333ccc2932\") " Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.072520 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldtxr\" (UniqueName: \"kubernetes.io/projected/2df1b74b-f6be-41c4-b9f1-6916553ef1d9-kube-api-access-ldtxr\") pod \"2df1b74b-f6be-41c4-b9f1-6916553ef1d9\" (UID: \"2df1b74b-f6be-41c4-b9f1-6916553ef1d9\") " Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.072580 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm962\" (UniqueName: \"kubernetes.io/projected/1480b26d-86ec-4157-ae9d-d3333ccc2932-kube-api-access-sm962\") pod \"1480b26d-86ec-4157-ae9d-d3333ccc2932\" (UID: \"1480b26d-86ec-4157-ae9d-d3333ccc2932\") " Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.072714 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brf62\" (UniqueName: \"kubernetes.io/projected/b5167f66-289b-4976-b502-640a327fa7bc-kube-api-access-brf62\") pod \"b5167f66-289b-4976-b502-640a327fa7bc\" (UID: \"b5167f66-289b-4976-b502-640a327fa7bc\") " Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.073122 4628 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11763665-9dfb-4894-94f3-1729ee24848a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.074276 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5167f66-289b-4976-b502-640a327fa7bc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b5167f66-289b-4976-b502-640a327fa7bc" (UID: "b5167f66-289b-4976-b502-640a327fa7bc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.074741 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2df1b74b-f6be-41c4-b9f1-6916553ef1d9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2df1b74b-f6be-41c4-b9f1-6916553ef1d9" (UID: "2df1b74b-f6be-41c4-b9f1-6916553ef1d9"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.075139 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1480b26d-86ec-4157-ae9d-d3333ccc2932-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1480b26d-86ec-4157-ae9d-d3333ccc2932" (UID: "1480b26d-86ec-4157-ae9d-d3333ccc2932"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.079473 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df1b74b-f6be-41c4-b9f1-6916553ef1d9-kube-api-access-ldtxr" (OuterVolumeSpecName: "kube-api-access-ldtxr") pod "2df1b74b-f6be-41c4-b9f1-6916553ef1d9" (UID: "2df1b74b-f6be-41c4-b9f1-6916553ef1d9"). InnerVolumeSpecName "kube-api-access-ldtxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.091554 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f632-account-create-update-cvcfq" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.091953 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xhlf8" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.097601 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bfc0-account-create-update-6lpng" event={"ID":"1480b26d-86ec-4157-ae9d-d3333ccc2932","Type":"ContainerDied","Data":"ff64720b725f9eaf64705f7c9e9e3a7fac3710f09a858edf1b6bcc1d24e3f836"} Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.097638 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff64720b725f9eaf64705f7c9e9e3a7fac3710f09a858edf1b6bcc1d24e3f836" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.097697 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bfc0-account-create-update-6lpng" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.104468 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f632-account-create-update-cvcfq" event={"ID":"c13e137b-1a8a-4965-8f85-04ad2b5ff488","Type":"ContainerDied","Data":"a492bf50bae81f06f2433b59649fb8a4ccf5a27af090357531857477d8188528"} Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.104500 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a492bf50bae81f06f2433b59649fb8a4ccf5a27af090357531857477d8188528" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.104555 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f632-account-create-update-cvcfq" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.114749 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5167f66-289b-4976-b502-640a327fa7bc-kube-api-access-brf62" (OuterVolumeSpecName: "kube-api-access-brf62") pod "b5167f66-289b-4976-b502-640a327fa7bc" (UID: "b5167f66-289b-4976-b502-640a327fa7bc"). InnerVolumeSpecName "kube-api-access-brf62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.124869 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1480b26d-86ec-4157-ae9d-d3333ccc2932-kube-api-access-sm962" (OuterVolumeSpecName: "kube-api-access-sm962") pod "1480b26d-86ec-4157-ae9d-d3333ccc2932" (UID: "1480b26d-86ec-4157-ae9d-d3333ccc2932"). InnerVolumeSpecName "kube-api-access-sm962". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.153199 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612f2afd-9958-4367-a8c0-13066a05cd11","Type":"ContainerStarted","Data":"31ea33ade6f081e5d71f8c26d758d84a6ddb171743c73422f2f8b5d0b26bb962"} Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.154969 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-01ba-account-create-update-wzhwm" event={"ID":"b5167f66-289b-4976-b502-640a327fa7bc","Type":"ContainerDied","Data":"da79dd415525cc6cefb7c349cdd8dbdbd65417e6705b58dc8218adcaa8d4685e"} Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.155002 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da79dd415525cc6cefb7c349cdd8dbdbd65417e6705b58dc8218adcaa8d4685e" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.155006 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-01ba-account-create-update-wzhwm" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.156539 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2pxdl" event={"ID":"2df1b74b-f6be-41c4-b9f1-6916553ef1d9","Type":"ContainerDied","Data":"c16bbc70cd38e63fd1c42d2ec77e2b54c2d009efde225015c955c73ef2ab54cb"} Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.156560 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c16bbc70cd38e63fd1c42d2ec77e2b54c2d009efde225015c955c73ef2ab54cb" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.156609 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2pxdl" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.168592 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cl2fq" event={"ID":"11763665-9dfb-4894-94f3-1729ee24848a","Type":"ContainerDied","Data":"a11dd99962d2d906a98583a3fb0db1eb051a224ce86ef2d0add10c317d8b14b2"} Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.168626 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a11dd99962d2d906a98583a3fb0db1eb051a224ce86ef2d0add10c317d8b14b2" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.168987 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-cl2fq" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.176475 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cc9b\" (UniqueName: \"kubernetes.io/projected/c13e137b-1a8a-4965-8f85-04ad2b5ff488-kube-api-access-4cc9b\") pod \"c13e137b-1a8a-4965-8f85-04ad2b5ff488\" (UID: \"c13e137b-1a8a-4965-8f85-04ad2b5ff488\") " Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.176669 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13e137b-1a8a-4965-8f85-04ad2b5ff488-operator-scripts\") pod \"c13e137b-1a8a-4965-8f85-04ad2b5ff488\" (UID: \"c13e137b-1a8a-4965-8f85-04ad2b5ff488\") " Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.176902 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b35a721-4483-4f15-a0a4-b516b96f9c76-operator-scripts\") pod \"2b35a721-4483-4f15-a0a4-b516b96f9c76\" (UID: \"2b35a721-4483-4f15-a0a4-b516b96f9c76\") " Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.177002 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78zrw\" (UniqueName: \"kubernetes.io/projected/2b35a721-4483-4f15-a0a4-b516b96f9c76-kube-api-access-78zrw\") pod \"2b35a721-4483-4f15-a0a4-b516b96f9c76\" (UID: \"2b35a721-4483-4f15-a0a4-b516b96f9c76\") " Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.177978 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b35a721-4483-4f15-a0a4-b516b96f9c76-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b35a721-4483-4f15-a0a4-b516b96f9c76" (UID: "2b35a721-4483-4f15-a0a4-b516b96f9c76"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.178349 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c13e137b-1a8a-4965-8f85-04ad2b5ff488-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c13e137b-1a8a-4965-8f85-04ad2b5ff488" (UID: "c13e137b-1a8a-4965-8f85-04ad2b5ff488"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.178690 4628 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2df1b74b-f6be-41c4-b9f1-6916553ef1d9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.178762 4628 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1480b26d-86ec-4157-ae9d-d3333ccc2932-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.178817 4628 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b35a721-4483-4f15-a0a4-b516b96f9c76-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.178897 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldtxr\" (UniqueName: \"kubernetes.io/projected/2df1b74b-f6be-41c4-b9f1-6916553ef1d9-kube-api-access-ldtxr\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.178964 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm962\" (UniqueName: \"kubernetes.io/projected/1480b26d-86ec-4157-ae9d-d3333ccc2932-kube-api-access-sm962\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.179049 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brf62\" (UniqueName: \"kubernetes.io/projected/b5167f66-289b-4976-b502-640a327fa7bc-kube-api-access-brf62\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.179106 4628 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c13e137b-1a8a-4965-8f85-04ad2b5ff488-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.179159 4628 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5167f66-289b-4976-b502-640a327fa7bc-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.179941 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b35a721-4483-4f15-a0a4-b516b96f9c76-kube-api-access-78zrw" (OuterVolumeSpecName: "kube-api-access-78zrw") pod "2b35a721-4483-4f15-a0a4-b516b96f9c76" (UID: "2b35a721-4483-4f15-a0a4-b516b96f9c76"). InnerVolumeSpecName "kube-api-access-78zrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.180862 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xhlf8" event={"ID":"2b35a721-4483-4f15-a0a4-b516b96f9c76","Type":"ContainerDied","Data":"d928694f2c99702509d7c53674c820e1c7e84b37f353ce85323e9007ff64a4e8"} Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.180896 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d928694f2c99702509d7c53674c820e1c7e84b37f353ce85323e9007ff64a4e8" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.180948 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-xhlf8" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.183356 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c13e137b-1a8a-4965-8f85-04ad2b5ff488-kube-api-access-4cc9b" (OuterVolumeSpecName: "kube-api-access-4cc9b") pod "c13e137b-1a8a-4965-8f85-04ad2b5ff488" (UID: "c13e137b-1a8a-4965-8f85-04ad2b5ff488"). InnerVolumeSpecName "kube-api-access-4cc9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.285787 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cc9b\" (UniqueName: \"kubernetes.io/projected/c13e137b-1a8a-4965-8f85-04ad2b5ff488-kube-api-access-4cc9b\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:34 crc kubenswrapper[4628]: I1211 05:31:34.285818 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78zrw\" (UniqueName: \"kubernetes.io/projected/2b35a721-4483-4f15-a0a4-b516b96f9c76-kube-api-access-78zrw\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.195606 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612f2afd-9958-4367-a8c0-13066a05cd11","Type":"ContainerStarted","Data":"9e79dedccbfebddc984161798e6c42d1f040c6937f7957cce38e43304dec841b"} Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.195935 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612f2afd-9958-4367-a8c0-13066a05cd11","Type":"ContainerStarted","Data":"08d1e3e7253922a104a8009bd509176d140f0a4006117017c5e19be64b55cfba"} Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.231598 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.847479687 podStartE2EDuration="40.231580097s" podCreationTimestamp="2025-12-11 05:30:55 +0000 UTC" firstStartedPulling="2025-12-11 05:31:13.712176303 +0000 UTC m=+976.129523001" lastFinishedPulling="2025-12-11 05:31:32.096276713 +0000 UTC m=+994.513623411" observedRunningTime="2025-12-11 05:31:35.226486387 +0000 UTC m=+997.643833105" watchObservedRunningTime="2025-12-11 05:31:35.231580097 +0000 UTC m=+997.648926795" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.497703 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-v5zrj"] Dec 11 05:31:35 crc kubenswrapper[4628]: E1211 05:31:35.498124 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5167f66-289b-4976-b502-640a327fa7bc" containerName="mariadb-account-create-update" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.498142 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5167f66-289b-4976-b502-640a327fa7bc" containerName="mariadb-account-create-update" Dec 11 05:31:35 crc kubenswrapper[4628]: E1211 05:31:35.498149 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b35a721-4483-4f15-a0a4-b516b96f9c76" containerName="mariadb-database-create" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.498155 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b35a721-4483-4f15-a0a4-b516b96f9c76" containerName="mariadb-database-create" Dec 11 05:31:35 crc kubenswrapper[4628]: E1211 05:31:35.498176 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1480b26d-86ec-4157-ae9d-d3333ccc2932" containerName="mariadb-account-create-update" Dec 11 05:31:35 crc 
kubenswrapper[4628]: I1211 05:31:35.498183 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="1480b26d-86ec-4157-ae9d-d3333ccc2932" containerName="mariadb-account-create-update" Dec 11 05:31:35 crc kubenswrapper[4628]: E1211 05:31:35.498192 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11763665-9dfb-4894-94f3-1729ee24848a" containerName="mariadb-database-create" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.498199 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="11763665-9dfb-4894-94f3-1729ee24848a" containerName="mariadb-database-create" Dec 11 05:31:35 crc kubenswrapper[4628]: E1211 05:31:35.498213 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df1b74b-f6be-41c4-b9f1-6916553ef1d9" containerName="mariadb-database-create" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.498220 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df1b74b-f6be-41c4-b9f1-6916553ef1d9" containerName="mariadb-database-create" Dec 11 05:31:35 crc kubenswrapper[4628]: E1211 05:31:35.498232 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13e137b-1a8a-4965-8f85-04ad2b5ff488" containerName="mariadb-account-create-update" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.498238 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13e137b-1a8a-4965-8f85-04ad2b5ff488" containerName="mariadb-account-create-update" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.498392 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b35a721-4483-4f15-a0a4-b516b96f9c76" containerName="mariadb-database-create" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.498406 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="1480b26d-86ec-4157-ae9d-d3333ccc2932" containerName="mariadb-account-create-update" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.498415 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df1b74b-f6be-41c4-b9f1-6916553ef1d9" containerName="mariadb-database-create" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.498433 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="11763665-9dfb-4894-94f3-1729ee24848a" containerName="mariadb-database-create" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.498448 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="c13e137b-1a8a-4965-8f85-04ad2b5ff488" containerName="mariadb-account-create-update" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.498470 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5167f66-289b-4976-b502-640a327fa7bc" containerName="mariadb-account-create-update" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.499291 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.501775 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.504545 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-dns-svc\") pod \"dnsmasq-dns-764c5664d7-v5zrj\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.504601 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wzfk\" (UniqueName: \"kubernetes.io/projected/bb39a2b9-4933-4004-97eb-669d61fade13-kube-api-access-8wzfk\") pod \"dnsmasq-dns-764c5664d7-v5zrj\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.504629 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-v5zrj\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.504693 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-v5zrj\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.504807 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-v5zrj\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.504933 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-config\") pod \"dnsmasq-dns-764c5664d7-v5zrj\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.517522 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-v5zrj"] Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.606265 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-v5zrj\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.606402 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-config\") pod \"dnsmasq-dns-764c5664d7-v5zrj\" (UID: 
\"bb39a2b9-4933-4004-97eb-669d61fade13\") " pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.606579 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-dns-svc\") pod \"dnsmasq-dns-764c5664d7-v5zrj\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.606633 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wzfk\" (UniqueName: \"kubernetes.io/projected/bb39a2b9-4933-4004-97eb-669d61fade13-kube-api-access-8wzfk\") pod \"dnsmasq-dns-764c5664d7-v5zrj\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.606660 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-v5zrj\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.606771 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-v5zrj\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.607065 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-v5zrj\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.609269 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-config\") pod \"dnsmasq-dns-764c5664d7-v5zrj\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.609350 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-v5zrj\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.609381 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-dns-svc\") pod \"dnsmasq-dns-764c5664d7-v5zrj\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.609825 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-v5zrj\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc 
kubenswrapper[4628]: I1211 05:31:35.639829 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wzfk\" (UniqueName: \"kubernetes.io/projected/bb39a2b9-4933-4004-97eb-669d61fade13-kube-api-access-8wzfk\") pod \"dnsmasq-dns-764c5664d7-v5zrj\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:35 crc kubenswrapper[4628]: I1211 05:31:35.839212 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:38 crc kubenswrapper[4628]: I1211 05:31:38.420997 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-v5zrj"] Dec 11 05:31:39 crc kubenswrapper[4628]: I1211 05:31:39.244278 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p46mc" event={"ID":"4dcb4aef-66a4-452a-a29b-5d387373785e","Type":"ContainerStarted","Data":"b2ff9d482808e0260af9e769ffb830176737c5e86d0a0cf19b631f2b64bdebb1"} Dec 11 05:31:39 crc kubenswrapper[4628]: I1211 05:31:39.247112 4628 generic.go:334] "Generic (PLEG): container finished" podID="bb39a2b9-4933-4004-97eb-669d61fade13" containerID="c98a9969b117f65daeac7a1f9b8c35d3a40e5cf3ce6d3b9993355293b57f72b7" exitCode=0 Dec 11 05:31:39 crc kubenswrapper[4628]: I1211 05:31:39.247166 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" event={"ID":"bb39a2b9-4933-4004-97eb-669d61fade13","Type":"ContainerDied","Data":"c98a9969b117f65daeac7a1f9b8c35d3a40e5cf3ce6d3b9993355293b57f72b7"} Dec 11 05:31:39 crc kubenswrapper[4628]: I1211 05:31:39.247186 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" event={"ID":"bb39a2b9-4933-4004-97eb-669d61fade13","Type":"ContainerStarted","Data":"a65ba2f594128a882ca5c9ae9cb7fd5c12c405a4668a4f38e99fa180054f2da3"} Dec 11 05:31:39 crc kubenswrapper[4628]: I1211 05:31:39.253714 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ksb9p" event={"ID":"98c407a5-95e8-4036-becd-3522286435d5","Type":"ContainerStarted","Data":"f5cb61ff69c4cc2f3f37f09a5f1c204ec02775c42134b14406f6a1af785698b2"} Dec 11 05:31:39 crc kubenswrapper[4628]: I1211 05:31:39.306983 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-p46mc" podStartSLOduration=2.945423272 podStartE2EDuration="10.30696604s" podCreationTimestamp="2025-12-11 05:31:29 +0000 UTC" firstStartedPulling="2025-12-11 05:31:30.596677043 +0000 UTC m=+993.014023731" lastFinishedPulling="2025-12-11 05:31:37.958219801 +0000 UTC m=+1000.375566499" observedRunningTime="2025-12-11 05:31:39.270220687 +0000 UTC m=+1001.687567385" watchObservedRunningTime="2025-12-11 05:31:39.30696604 +0000 UTC m=+1001.724312738" Dec 11 05:31:39 crc kubenswrapper[4628]: I1211 05:31:39.334388 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-ksb9p" podStartSLOduration=2.400339016 podStartE2EDuration="32.334351356s" podCreationTimestamp="2025-12-11 05:31:07 +0000 UTC" firstStartedPulling="2025-12-11 05:31:08.023313346 +0000 UTC m=+970.440660044" lastFinishedPulling="2025-12-11 05:31:37.957325666 +0000 UTC m=+1000.374672384" observedRunningTime="2025-12-11 05:31:39.328719521 +0000 UTC m=+1001.746066229" watchObservedRunningTime="2025-12-11 05:31:39.334351356 +0000 UTC m=+1001.751698044" Dec 11 05:31:40 crc kubenswrapper[4628]: I1211 05:31:40.262188 4628 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" event={"ID":"bb39a2b9-4933-4004-97eb-669d61fade13","Type":"ContainerStarted","Data":"ff1429699756acfb78f316eccec1624d035e6506673b81c42821581891bf56ab"} Dec 11 05:31:40 crc kubenswrapper[4628]: I1211 05:31:40.263172 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:40 crc kubenswrapper[4628]: I1211 05:31:40.287276 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" podStartSLOduration=5.287256458 podStartE2EDuration="5.287256458s" podCreationTimestamp="2025-12-11 05:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:31:40.284739518 +0000 UTC m=+1002.702086206" watchObservedRunningTime="2025-12-11 05:31:40.287256458 +0000 UTC m=+1002.704603156" Dec 11 05:31:42 crc kubenswrapper[4628]: I1211 05:31:42.290529 4628 generic.go:334] "Generic (PLEG): container finished" podID="4dcb4aef-66a4-452a-a29b-5d387373785e" containerID="b2ff9d482808e0260af9e769ffb830176737c5e86d0a0cf19b631f2b64bdebb1" exitCode=0 Dec 11 05:31:42 crc kubenswrapper[4628]: I1211 05:31:42.290695 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p46mc" event={"ID":"4dcb4aef-66a4-452a-a29b-5d387373785e","Type":"ContainerDied","Data":"b2ff9d482808e0260af9e769ffb830176737c5e86d0a0cf19b631f2b64bdebb1"} Dec 11 05:31:43 crc kubenswrapper[4628]: I1211 05:31:43.663238 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p46mc" Dec 11 05:31:43 crc kubenswrapper[4628]: I1211 05:31:43.847314 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dcb4aef-66a4-452a-a29b-5d387373785e-config-data\") pod \"4dcb4aef-66a4-452a-a29b-5d387373785e\" (UID: \"4dcb4aef-66a4-452a-a29b-5d387373785e\") " Dec 11 05:31:43 crc kubenswrapper[4628]: I1211 05:31:43.847450 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6mvm\" (UniqueName: \"kubernetes.io/projected/4dcb4aef-66a4-452a-a29b-5d387373785e-kube-api-access-d6mvm\") pod \"4dcb4aef-66a4-452a-a29b-5d387373785e\" (UID: \"4dcb4aef-66a4-452a-a29b-5d387373785e\") " Dec 11 05:31:43 crc kubenswrapper[4628]: I1211 05:31:43.847519 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dcb4aef-66a4-452a-a29b-5d387373785e-combined-ca-bundle\") pod \"4dcb4aef-66a4-452a-a29b-5d387373785e\" (UID: \"4dcb4aef-66a4-452a-a29b-5d387373785e\") " Dec 11 05:31:43 crc kubenswrapper[4628]: I1211 05:31:43.864021 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dcb4aef-66a4-452a-a29b-5d387373785e-kube-api-access-d6mvm" (OuterVolumeSpecName: "kube-api-access-d6mvm") pod "4dcb4aef-66a4-452a-a29b-5d387373785e" (UID: "4dcb4aef-66a4-452a-a29b-5d387373785e"). InnerVolumeSpecName "kube-api-access-d6mvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:31:43 crc kubenswrapper[4628]: I1211 05:31:43.876633 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dcb4aef-66a4-452a-a29b-5d387373785e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dcb4aef-66a4-452a-a29b-5d387373785e" (UID: "4dcb4aef-66a4-452a-a29b-5d387373785e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:31:43 crc kubenswrapper[4628]: I1211 05:31:43.915456 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dcb4aef-66a4-452a-a29b-5d387373785e-config-data" (OuterVolumeSpecName: "config-data") pod "4dcb4aef-66a4-452a-a29b-5d387373785e" (UID: "4dcb4aef-66a4-452a-a29b-5d387373785e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:31:43 crc kubenswrapper[4628]: I1211 05:31:43.949398 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dcb4aef-66a4-452a-a29b-5d387373785e-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:43 crc kubenswrapper[4628]: I1211 05:31:43.949437 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6mvm\" (UniqueName: \"kubernetes.io/projected/4dcb4aef-66a4-452a-a29b-5d387373785e-kube-api-access-d6mvm\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:43 crc kubenswrapper[4628]: I1211 05:31:43.949452 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dcb4aef-66a4-452a-a29b-5d387373785e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.312894 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p46mc" event={"ID":"4dcb4aef-66a4-452a-a29b-5d387373785e","Type":"ContainerDied","Data":"603c6938dcf9b98a0710f60fdb95b06e4911d7e55f8ca9dbb8a0af8ed9e78a77"} Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.312936 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="603c6938dcf9b98a0710f60fdb95b06e4911d7e55f8ca9dbb8a0af8ed9e78a77" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.312998 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p46mc" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.634884 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-lqk5g"] Dec 11 05:31:44 crc kubenswrapper[4628]: E1211 05:31:44.635261 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dcb4aef-66a4-452a-a29b-5d387373785e" containerName="keystone-db-sync" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.635282 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dcb4aef-66a4-452a-a29b-5d387373785e" containerName="keystone-db-sync" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.635432 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dcb4aef-66a4-452a-a29b-5d387373785e" containerName="keystone-db-sync" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.635977 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.638777 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.638777 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.638923 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.639080 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-58h9w" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.645624 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.662335 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-config-data\") pod \"keystone-bootstrap-lqk5g\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.662554 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ns64\" (UniqueName: \"kubernetes.io/projected/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-kube-api-access-6ns64\") pod \"keystone-bootstrap-lqk5g\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.662653 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-fernet-keys\") pod \"keystone-bootstrap-lqk5g\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.662749 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-scripts\") pod \"keystone-bootstrap-lqk5g\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.662838 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-credential-keys\") pod \"keystone-bootstrap-lqk5g\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.671115 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-combined-ca-bundle\") pod \"keystone-bootstrap-lqk5g\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.671932 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lqk5g"] Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.679922 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-764c5664d7-v5zrj"] Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.680246 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" podUID="bb39a2b9-4933-4004-97eb-669d61fade13" containerName="dnsmasq-dns" containerID="cri-o://ff1429699756acfb78f316eccec1624d035e6506673b81c42821581891bf56ab" gracePeriod=10 Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.697092 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.729791 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-58jzx"] Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.742429 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.760888 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-58jzx"] Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.775690 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-config-data\") pod \"keystone-bootstrap-lqk5g\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.775959 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ns64\" (UniqueName: \"kubernetes.io/projected/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-kube-api-access-6ns64\") pod \"keystone-bootstrap-lqk5g\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.776066 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-fernet-keys\") pod \"keystone-bootstrap-lqk5g\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.776159 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l92lx\" (UniqueName: \"kubernetes.io/projected/18c07bbf-1357-4d94-afb7-0620ac8222eb-kube-api-access-l92lx\") pod \"dnsmasq-dns-5959f8865f-58jzx\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.776258 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-scripts\") pod \"keystone-bootstrap-lqk5g\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.776329 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-58jzx\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.776420 4628 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-config\") pod \"dnsmasq-dns-5959f8865f-58jzx\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.776565 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-58jzx\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.776633 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-credential-keys\") pod \"keystone-bootstrap-lqk5g\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.776714 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-58jzx\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.776797 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-dns-svc\") pod \"dnsmasq-dns-5959f8865f-58jzx\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.776941 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-combined-ca-bundle\") pod \"keystone-bootstrap-lqk5g\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.788018 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-scripts\") pod \"keystone-bootstrap-lqk5g\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.788390 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-credential-keys\") pod \"keystone-bootstrap-lqk5g\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.788894 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-combined-ca-bundle\") pod \"keystone-bootstrap-lqk5g\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.796361 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-fernet-keys\") pod \"keystone-bootstrap-lqk5g\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.805881 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-config-data\") pod \"keystone-bootstrap-lqk5g\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.884431 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ns64\" (UniqueName: \"kubernetes.io/projected/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-kube-api-access-6ns64\") pod \"keystone-bootstrap-lqk5g\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.890743 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-58jzx\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.899620 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-58jzx\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.916299 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-dns-svc\") pod \"dnsmasq-dns-5959f8865f-58jzx\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.917054 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l92lx\" (UniqueName: \"kubernetes.io/projected/18c07bbf-1357-4d94-afb7-0620ac8222eb-kube-api-access-l92lx\") pod \"dnsmasq-dns-5959f8865f-58jzx\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.917130 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-58jzx\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.917197 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-config\") pod \"dnsmasq-dns-5959f8865f-58jzx\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.917225 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5959f8865f-58jzx\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.917923 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-dns-svc\") pod \"dnsmasq-dns-5959f8865f-58jzx\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.918149 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-58jzx\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.919283 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-58jzx\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.919613 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-config\") pod \"dnsmasq-dns-5959f8865f-58jzx\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.963563 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.962989 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-768bfb8b9-pbxmb"] Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.982966 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l92lx\" (UniqueName: \"kubernetes.io/projected/18c07bbf-1357-4d94-afb7-0620ac8222eb-kube-api-access-l92lx\") pod \"dnsmasq-dns-5959f8865f-58jzx\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.983806 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:31:44 crc kubenswrapper[4628]: I1211 05:31:44.990484 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-768bfb8b9-pbxmb"] Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.014128 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-n2b6t"] Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.015412 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.020342 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.020523 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.020568 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-nl8jm" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.020734 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-57cjn" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.020907 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.020999 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.021078 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.021839 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-horizon-secret-key\") pod \"horizon-768bfb8b9-pbxmb\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.021883 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-scripts\") pod \"horizon-768bfb8b9-pbxmb\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.021965 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-logs\") pod \"horizon-768bfb8b9-pbxmb\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.022010 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lcxk\" (UniqueName: \"kubernetes.io/projected/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-kube-api-access-9lcxk\") pod \"horizon-768bfb8b9-pbxmb\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.022061 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-config-data\") pod \"horizon-768bfb8b9-pbxmb\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.081428 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-n2b6t"] Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.127638 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-horizon-secret-key\") pod \"horizon-768bfb8b9-pbxmb\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.127702 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-scripts\") pod \"cinder-db-sync-n2b6t\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.127725 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-scripts\") pod \"horizon-768bfb8b9-pbxmb\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.127773 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp4x5\" (UniqueName: \"kubernetes.io/projected/38627c48-4a86-4721-874d-8f386ea24495-kube-api-access-hp4x5\") pod \"cinder-db-sync-n2b6t\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.127806 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-db-sync-config-data\") pod \"cinder-db-sync-n2b6t\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.127831 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-logs\") pod \"horizon-768bfb8b9-pbxmb\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.127876 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-config-data\") pod \"cinder-db-sync-n2b6t\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.127915 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38627c48-4a86-4721-874d-8f386ea24495-etc-machine-id\") pod \"cinder-db-sync-n2b6t\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.127937 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lcxk\" (UniqueName: \"kubernetes.io/projected/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-kube-api-access-9lcxk\") pod \"horizon-768bfb8b9-pbxmb\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.127957 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-combined-ca-bundle\") pod \"cinder-db-sync-n2b6t\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.127991 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-config-data\") pod \"horizon-768bfb8b9-pbxmb\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.129089 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-config-data\") pod \"horizon-768bfb8b9-pbxmb\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.130866 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-logs\") pod \"horizon-768bfb8b9-pbxmb\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.131351 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-scripts\") pod \"horizon-768bfb8b9-pbxmb\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.136666 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-horizon-secret-key\") pod \"horizon-768bfb8b9-pbxmb\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.166930 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-zvwzz"] Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.168291 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zvwzz" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.170250 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-nd79k"] Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.171375 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-nd79k" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.178510 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.178767 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.178993 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-gz72p" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.179306 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bh5kp" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.179567 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.184322 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lcxk\" (UniqueName: \"kubernetes.io/projected/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-kube-api-access-9lcxk\") pod \"horizon-768bfb8b9-pbxmb\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.190528 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-nd79k"] Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.206882 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-zvwzz"] Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.230871 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-scripts\") pod \"cinder-db-sync-n2b6t\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.230936 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5-db-sync-config-data\") pod \"barbican-db-sync-nd79k\" (UID: \"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5\") " pod="openstack/barbican-db-sync-nd79k" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.230964 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2x4d\" (UniqueName: \"kubernetes.io/projected/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc-kube-api-access-k2x4d\") pod \"neutron-db-sync-zvwzz\" (UID: \"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc\") " pod="openstack/neutron-db-sync-zvwzz" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.230986 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc-combined-ca-bundle\") pod \"neutron-db-sync-zvwzz\" (UID: \"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc\") " pod="openstack/neutron-db-sync-zvwzz" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.231006 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp4x5\" (UniqueName: \"kubernetes.io/projected/38627c48-4a86-4721-874d-8f386ea24495-kube-api-access-hp4x5\") pod \"cinder-db-sync-n2b6t\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " 
pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.231038 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-db-sync-config-data\") pod \"cinder-db-sync-n2b6t\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.231082 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-config-data\") pod \"cinder-db-sync-n2b6t\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.231117 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38627c48-4a86-4721-874d-8f386ea24495-etc-machine-id\") pod \"cinder-db-sync-n2b6t\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.231134 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-combined-ca-bundle\") pod \"cinder-db-sync-n2b6t\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.231153 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2wpr\" (UniqueName: \"kubernetes.io/projected/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5-kube-api-access-p2wpr\") pod \"barbican-db-sync-nd79k\" (UID: \"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5\") " pod="openstack/barbican-db-sync-nd79k" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.231176 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5-combined-ca-bundle\") pod \"barbican-db-sync-nd79k\" (UID: \"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5\") " pod="openstack/barbican-db-sync-nd79k" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.231195 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc-config\") pod \"neutron-db-sync-zvwzz\" (UID: \"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc\") " pod="openstack/neutron-db-sync-zvwzz" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.232289 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38627c48-4a86-4721-874d-8f386ea24495-etc-machine-id\") pod \"cinder-db-sync-n2b6t\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.232529 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f8668ff77-t645d"] Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.233984 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.237494 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-scripts\") pod \"cinder-db-sync-n2b6t\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.238124 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-db-sync-config-data\") pod \"cinder-db-sync-n2b6t\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.247606 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.257933 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-combined-ca-bundle\") pod \"cinder-db-sync-n2b6t\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.258924 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-config-data\") pod \"cinder-db-sync-n2b6t\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.279111 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp4x5\" (UniqueName: \"kubernetes.io/projected/38627c48-4a86-4721-874d-8f386ea24495-kube-api-access-hp4x5\") pod \"cinder-db-sync-n2b6t\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.297257 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-58jzx"] Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.333157 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5-db-sync-config-data\") pod \"barbican-db-sync-nd79k\" (UID: \"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5\") " pod="openstack/barbican-db-sync-nd79k" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.333201 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxsxm\" (UniqueName: \"kubernetes.io/projected/cfadba6b-22f6-471b-b328-c17e9b74e67a-kube-api-access-lxsxm\") pod \"horizon-f8668ff77-t645d\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.333227 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2x4d\" (UniqueName: \"kubernetes.io/projected/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc-kube-api-access-k2x4d\") pod \"neutron-db-sync-zvwzz\" (UID: \"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc\") " pod="openstack/neutron-db-sync-zvwzz" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.333246 4628 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc-combined-ca-bundle\") pod \"neutron-db-sync-zvwzz\" (UID: \"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc\") " pod="openstack/neutron-db-sync-zvwzz" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.333272 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cfadba6b-22f6-471b-b328-c17e9b74e67a-config-data\") pod \"horizon-f8668ff77-t645d\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.333336 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2wpr\" (UniqueName: \"kubernetes.io/projected/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5-kube-api-access-p2wpr\") pod \"barbican-db-sync-nd79k\" (UID: \"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5\") " pod="openstack/barbican-db-sync-nd79k" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.333355 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5-combined-ca-bundle\") pod \"barbican-db-sync-nd79k\" (UID: \"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5\") " pod="openstack/barbican-db-sync-nd79k" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.333372 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc-config\") pod \"neutron-db-sync-zvwzz\" (UID: \"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc\") " pod="openstack/neutron-db-sync-zvwzz" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.333394 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfadba6b-22f6-471b-b328-c17e9b74e67a-logs\") pod \"horizon-f8668ff77-t645d\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.333431 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cfadba6b-22f6-471b-b328-c17e9b74e67a-horizon-secret-key\") pod \"horizon-f8668ff77-t645d\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.333451 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfadba6b-22f6-471b-b328-c17e9b74e67a-scripts\") pod \"horizon-f8668ff77-t645d\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.335334 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-8b969"] Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.337290 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8b969" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.341216 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc-combined-ca-bundle\") pod \"neutron-db-sync-zvwzz\" (UID: \"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc\") " pod="openstack/neutron-db-sync-zvwzz" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.342202 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.342251 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.342276 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5-combined-ca-bundle\") pod \"barbican-db-sync-nd79k\" (UID: \"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5\") " pod="openstack/barbican-db-sync-nd79k" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.342425 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ggglx" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.371761 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2x4d\" (UniqueName: \"kubernetes.io/projected/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc-kube-api-access-k2x4d\") pod \"neutron-db-sync-zvwzz\" (UID: \"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc\") " pod="openstack/neutron-db-sync-zvwzz" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.371935 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc-config\") pod \"neutron-db-sync-zvwzz\" (UID: \"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc\") " pod="openstack/neutron-db-sync-zvwzz" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.372371 4628 generic.go:334] "Generic (PLEG): container finished" podID="bb39a2b9-4933-4004-97eb-669d61fade13" containerID="ff1429699756acfb78f316eccec1624d035e6506673b81c42821581891bf56ab" exitCode=0 Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.372402 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" event={"ID":"bb39a2b9-4933-4004-97eb-669d61fade13","Type":"ContainerDied","Data":"ff1429699756acfb78f316eccec1624d035e6506673b81c42821581891bf56ab"} Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.374162 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.374198 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f8668ff77-t645d"] Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.375621 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2wpr\" (UniqueName: \"kubernetes.io/projected/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5-kube-api-access-p2wpr\") pod \"barbican-db-sync-nd79k\" (UID: \"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5\") " pod="openstack/barbican-db-sync-nd79k" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.385249 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5-db-sync-config-data\") pod \"barbican-db-sync-nd79k\" (UID: \"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5\") " pod="openstack/barbican-db-sync-nd79k" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.404248 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8b969"] Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.413198 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.427172 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-dlqkw"] Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.428526 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.435816 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfadba6b-22f6-471b-b328-c17e9b74e67a-logs\") pod \"horizon-f8668ff77-t645d\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.435889 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cfadba6b-22f6-471b-b328-c17e9b74e67a-horizon-secret-key\") pod \"horizon-f8668ff77-t645d\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.435910 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfadba6b-22f6-471b-b328-c17e9b74e67a-scripts\") pod \"horizon-f8668ff77-t645d\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.435946 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxsxm\" (UniqueName: \"kubernetes.io/projected/cfadba6b-22f6-471b-b328-c17e9b74e67a-kube-api-access-lxsxm\") pod \"horizon-f8668ff77-t645d\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.435978 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cfadba6b-22f6-471b-b328-c17e9b74e67a-config-data\") pod \"horizon-f8668ff77-t645d\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:31:45 
crc kubenswrapper[4628]: I1211 05:31:45.439107 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfadba6b-22f6-471b-b328-c17e9b74e67a-logs\") pod \"horizon-f8668ff77-t645d\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.446810 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cfadba6b-22f6-471b-b328-c17e9b74e67a-config-data\") pod \"horizon-f8668ff77-t645d\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.460349 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-dlqkw"] Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.463519 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfadba6b-22f6-471b-b328-c17e9b74e67a-scripts\") pod \"horizon-f8668ff77-t645d\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.479666 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.491118 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.498402 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.498606 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.505879 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cfadba6b-22f6-471b-b328-c17e9b74e67a-horizon-secret-key\") pod \"horizon-f8668ff77-t645d\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.506049 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zvwzz" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.508867 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxsxm\" (UniqueName: \"kubernetes.io/projected/cfadba6b-22f6-471b-b328-c17e9b74e67a-kube-api-access-lxsxm\") pod \"horizon-f8668ff77-t645d\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.514729 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-nd79k" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.537919 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.554385 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-dlqkw\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.554505 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-scripts\") pod \"placement-db-sync-8b969\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " pod="openstack/placement-db-sync-8b969" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.554571 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-dlqkw\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.554618 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-dlqkw\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.554643 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-config-data\") pod \"placement-db-sync-8b969\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " pod="openstack/placement-db-sync-8b969" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.554687 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njt7h\" (UniqueName: \"kubernetes.io/projected/48b1c132-b854-4494-9e51-d934e9946366-kube-api-access-njt7h\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.554710 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-config-data\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.554750 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-config\") pod \"dnsmasq-dns-58dd9ff6bc-dlqkw\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.554771 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/48b1c132-b854-4494-9e51-d934e9946366-run-httpd\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.554790 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-combined-ca-bundle\") pod \"placement-db-sync-8b969\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " pod="openstack/placement-db-sync-8b969" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.554835 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48b1c132-b854-4494-9e51-d934e9946366-log-httpd\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.554871 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9jvz\" (UniqueName: \"kubernetes.io/projected/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-kube-api-access-h9jvz\") pod \"placement-db-sync-8b969\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " pod="openstack/placement-db-sync-8b969" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.554888 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-dlqkw\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.554910 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-scripts\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.554929 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.554954 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.554980 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-logs\") pod \"placement-db-sync-8b969\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " pod="openstack/placement-db-sync-8b969" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.554998 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5m5m\" (UniqueName: 
\"kubernetes.io/projected/061f7965-d09e-4f01-9ee8-06638befdf0c-kube-api-access-b5m5m\") pod \"dnsmasq-dns-58dd9ff6bc-dlqkw\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.580324 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.657914 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-scripts\") pod \"placement-db-sync-8b969\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " pod="openstack/placement-db-sync-8b969" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.658266 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-dlqkw\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.658293 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-dlqkw\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.658312 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-config-data\") pod \"placement-db-sync-8b969\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " pod="openstack/placement-db-sync-8b969" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.658340 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-config-data\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.658355 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njt7h\" (UniqueName: \"kubernetes.io/projected/48b1c132-b854-4494-9e51-d934e9946366-kube-api-access-njt7h\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.658383 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-config\") pod \"dnsmasq-dns-58dd9ff6bc-dlqkw\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.658400 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48b1c132-b854-4494-9e51-d934e9946366-run-httpd\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.658419 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-combined-ca-bundle\") pod \"placement-db-sync-8b969\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " pod="openstack/placement-db-sync-8b969" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.658446 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48b1c132-b854-4494-9e51-d934e9946366-log-httpd\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.658466 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9jvz\" (UniqueName: \"kubernetes.io/projected/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-kube-api-access-h9jvz\") pod \"placement-db-sync-8b969\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " pod="openstack/placement-db-sync-8b969" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.658484 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-dlqkw\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.658502 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-scripts\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.658518 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.658543 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.658564 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-logs\") pod \"placement-db-sync-8b969\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " pod="openstack/placement-db-sync-8b969" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.658583 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5m5m\" (UniqueName: \"kubernetes.io/projected/061f7965-d09e-4f01-9ee8-06638befdf0c-kube-api-access-b5m5m\") pod \"dnsmasq-dns-58dd9ff6bc-dlqkw\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.658609 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-dlqkw\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 
05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.659543 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-dlqkw\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.670985 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-config\") pod \"dnsmasq-dns-58dd9ff6bc-dlqkw\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.671687 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-dlqkw\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.672198 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-dlqkw\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.672392 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48b1c132-b854-4494-9e51-d934e9946366-log-httpd\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.672630 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48b1c132-b854-4494-9e51-d934e9946366-run-httpd\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.672947 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-logs\") pod \"placement-db-sync-8b969\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " pod="openstack/placement-db-sync-8b969" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.673439 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.673629 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-dlqkw\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.680778 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.681219 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-scripts\") pod \"placement-db-sync-8b969\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " pod="openstack/placement-db-sync-8b969" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.681598 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-combined-ca-bundle\") pod \"placement-db-sync-8b969\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " pod="openstack/placement-db-sync-8b969" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.681808 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-scripts\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.696813 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-config-data\") pod \"placement-db-sync-8b969\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " pod="openstack/placement-db-sync-8b969" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.700692 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-config-data\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.706081 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njt7h\" (UniqueName: \"kubernetes.io/projected/48b1c132-b854-4494-9e51-d934e9946366-kube-api-access-njt7h\") pod \"ceilometer-0\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.718457 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5m5m\" (UniqueName: \"kubernetes.io/projected/061f7965-d09e-4f01-9ee8-06638befdf0c-kube-api-access-b5m5m\") pod \"dnsmasq-dns-58dd9ff6bc-dlqkw\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.720771 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9jvz\" (UniqueName: \"kubernetes.io/projected/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-kube-api-access-h9jvz\") pod \"placement-db-sync-8b969\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " pod="openstack/placement-db-sync-8b969" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.808638 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.829192 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:31:45 crc kubenswrapper[4628]: I1211 05:31:45.989779 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8b969" Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.343830 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.350503 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lqk5g"] Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.395795 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-ovsdbserver-nb\") pod \"bb39a2b9-4933-4004-97eb-669d61fade13\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.396245 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-dns-svc\") pod \"bb39a2b9-4933-4004-97eb-669d61fade13\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.396311 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-config\") pod \"bb39a2b9-4933-4004-97eb-669d61fade13\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.396370 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-ovsdbserver-sb\") pod \"bb39a2b9-4933-4004-97eb-669d61fade13\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.396399 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-dns-swift-storage-0\") pod \"bb39a2b9-4933-4004-97eb-669d61fade13\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.396448 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wzfk\" (UniqueName: \"kubernetes.io/projected/bb39a2b9-4933-4004-97eb-669d61fade13-kube-api-access-8wzfk\") pod \"bb39a2b9-4933-4004-97eb-669d61fade13\" (UID: \"bb39a2b9-4933-4004-97eb-669d61fade13\") " Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.401401 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb39a2b9-4933-4004-97eb-669d61fade13-kube-api-access-8wzfk" (OuterVolumeSpecName: "kube-api-access-8wzfk") pod "bb39a2b9-4933-4004-97eb-669d61fade13" (UID: "bb39a2b9-4933-4004-97eb-669d61fade13"). InnerVolumeSpecName "kube-api-access-8wzfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.406639 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-58jzx"] Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.416929 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lqk5g" event={"ID":"0b98d4b7-cda3-44ff-87b6-8515b4ea082a","Type":"ContainerStarted","Data":"6ff4b767e036c0b4ca2e5299c6a1a6a73ff8541f2efc0a0d82eb8fc118fd423f"} Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.427695 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" event={"ID":"bb39a2b9-4933-4004-97eb-669d61fade13","Type":"ContainerDied","Data":"a65ba2f594128a882ca5c9ae9cb7fd5c12c405a4668a4f38e99fa180054f2da3"} Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.427780 4628 scope.go:117] "RemoveContainer" containerID="ff1429699756acfb78f316eccec1624d035e6506673b81c42821581891bf56ab" Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.428826 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.440198 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-58jzx" event={"ID":"18c07bbf-1357-4d94-afb7-0620ac8222eb","Type":"ContainerStarted","Data":"35eff1a528c6e6154105d0fdab4625e837f79907f830d5b928ae36bfb4341252"} Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.503598 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wzfk\" (UniqueName: \"kubernetes.io/projected/bb39a2b9-4933-4004-97eb-669d61fade13-kube-api-access-8wzfk\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.515069 4628 scope.go:117] "RemoveContainer" containerID="c98a9969b117f65daeac7a1f9b8c35d3a40e5cf3ce6d3b9993355293b57f72b7" Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.531788 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb39a2b9-4933-4004-97eb-669d61fade13" (UID: "bb39a2b9-4933-4004-97eb-669d61fade13"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.552406 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bb39a2b9-4933-4004-97eb-669d61fade13" (UID: "bb39a2b9-4933-4004-97eb-669d61fade13"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.558889 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-config" (OuterVolumeSpecName: "config") pod "bb39a2b9-4933-4004-97eb-669d61fade13" (UID: "bb39a2b9-4933-4004-97eb-669d61fade13"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.575261 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb39a2b9-4933-4004-97eb-669d61fade13" (UID: "bb39a2b9-4933-4004-97eb-669d61fade13"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.583638 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb39a2b9-4933-4004-97eb-669d61fade13" (UID: "bb39a2b9-4933-4004-97eb-669d61fade13"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.605326 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-n2b6t"] Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.605400 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.605428 4628 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.605438 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.605446 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.605456 4628 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb39a2b9-4933-4004-97eb-669d61fade13-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:46 crc kubenswrapper[4628]: W1211 05:31:46.622498 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38627c48_4a86_4721_874d_8f386ea24495.slice/crio-fe1f836871e8360ac99a4544a603fba688b30bd48c460659afb8be5e1d95bb68 WatchSource:0}: Error finding container fe1f836871e8360ac99a4544a603fba688b30bd48c460659afb8be5e1d95bb68: Status 404 returned error can't find the container with id fe1f836871e8360ac99a4544a603fba688b30bd48c460659afb8be5e1d95bb68 Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.674704 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-768bfb8b9-pbxmb"] Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.838014 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-zvwzz"] Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.842081 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f8668ff77-t645d"] Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.853140 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-764c5664d7-v5zrj"] Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.887196 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-v5zrj"] Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.901266 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-nd79k"] Dec 11 05:31:46 crc kubenswrapper[4628]: I1211 05:31:46.995584 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:31:47 crc kubenswrapper[4628]: W1211 05:31:47.012046 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod061f7965_d09e_4f01_9ee8_06638befdf0c.slice/crio-61a097379bc9fc7062390f35f4827683f5fb1c639d424b3ec5577de8215e99a9 WatchSource:0}: Error finding container 61a097379bc9fc7062390f35f4827683f5fb1c639d424b3ec5577de8215e99a9: Status 404 returned error can't find the container with id 61a097379bc9fc7062390f35f4827683f5fb1c639d424b3ec5577de8215e99a9 Dec 11 05:31:47 crc kubenswrapper[4628]: W1211 05:31:47.014774 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48b1c132_b854_4494_9e51_d934e9946366.slice/crio-055b42a696d531bb08da19ec1ebb0c86feb43deabf6aca8a08b2fd8bfddc67c7 WatchSource:0}: Error finding container 055b42a696d531bb08da19ec1ebb0c86feb43deabf6aca8a08b2fd8bfddc67c7: Status 404 returned error can't find the container with id 055b42a696d531bb08da19ec1ebb0c86feb43deabf6aca8a08b2fd8bfddc67c7 Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.026106 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-dlqkw"] Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.147793 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8b969"] Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.498168 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-768bfb8b9-pbxmb" event={"ID":"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1","Type":"ContainerStarted","Data":"c30d43104b6db752538dd0cd0bfa09e486ed29ddaf69a2c51740075058e44c77"} Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.511832 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n2b6t" event={"ID":"38627c48-4a86-4721-874d-8f386ea24495","Type":"ContainerStarted","Data":"fe1f836871e8360ac99a4544a603fba688b30bd48c460659afb8be5e1d95bb68"} Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.528092 4628 generic.go:334] "Generic (PLEG): container finished" podID="18c07bbf-1357-4d94-afb7-0620ac8222eb" containerID="0ece5ade3c64b6d0ac517add0a7a0900c09c98a16f3b6205ab40023ffbb43390" exitCode=0 Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.528172 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-58jzx" event={"ID":"18c07bbf-1357-4d94-afb7-0620ac8222eb","Type":"ContainerDied","Data":"0ece5ade3c64b6d0ac517add0a7a0900c09c98a16f3b6205ab40023ffbb43390"} Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.533795 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48b1c132-b854-4494-9e51-d934e9946366","Type":"ContainerStarted","Data":"055b42a696d531bb08da19ec1ebb0c86feb43deabf6aca8a08b2fd8bfddc67c7"} Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.535150 4628 generic.go:334] "Generic (PLEG): container finished" 
podID="061f7965-d09e-4f01-9ee8-06638befdf0c" containerID="714f134440ee888beb0da4181a1de1630acab07c653c5cc2dc91982416179464" exitCode=0 Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.535202 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" event={"ID":"061f7965-d09e-4f01-9ee8-06638befdf0c","Type":"ContainerDied","Data":"714f134440ee888beb0da4181a1de1630acab07c653c5cc2dc91982416179464"} Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.535223 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" event={"ID":"061f7965-d09e-4f01-9ee8-06638befdf0c","Type":"ContainerStarted","Data":"61a097379bc9fc7062390f35f4827683f5fb1c639d424b3ec5577de8215e99a9"} Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.537762 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nd79k" event={"ID":"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5","Type":"ContainerStarted","Data":"accaf55d0501467fb144c105fc0ce3f216ea988a05939b7ce52f0c835b7ba4f6"} Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.550897 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f8668ff77-t645d" event={"ID":"cfadba6b-22f6-471b-b328-c17e9b74e67a","Type":"ContainerStarted","Data":"16f6fc1a2f5ed49172d23205a16df42b6e5209f2604c000fad10bccde6a967f0"} Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.560380 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zvwzz" event={"ID":"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc","Type":"ContainerStarted","Data":"fdce9474a37802f20d7909fc13787fe95fe63b8a57ba9c19707a22b5c49b3a81"} Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.560420 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zvwzz" event={"ID":"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc","Type":"ContainerStarted","Data":"b82c9d502d882c9998cc9c3edaebf62d0ebc91265f44172a6c41c782cea2a718"} Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.562172 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8b969" event={"ID":"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c","Type":"ContainerStarted","Data":"f88493b1448a1c9f3a4c7bf996440c9d4e4427b134c06d4039b68f3a12e9a629"} Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.563136 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lqk5g" event={"ID":"0b98d4b7-cda3-44ff-87b6-8515b4ea082a","Type":"ContainerStarted","Data":"b02b72218dd222a768b93cfb0a78581959a67e389b45a4665d82c9724a3f6fca"} Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.612674 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-lqk5g" podStartSLOduration=3.612652059 podStartE2EDuration="3.612652059s" podCreationTimestamp="2025-12-11 05:31:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:31:47.601151982 +0000 UTC m=+1010.018498680" watchObservedRunningTime="2025-12-11 05:31:47.612652059 +0000 UTC m=+1010.029998757" Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.640729 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-zvwzz" podStartSLOduration=2.640705373 podStartE2EDuration="2.640705373s" podCreationTimestamp="2025-12-11 05:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:31:47.63552813 +0000 UTC m=+1010.052874828" watchObservedRunningTime="2025-12-11 05:31:47.640705373 +0000 UTC m=+1010.058052071" Dec 11 05:31:47 crc kubenswrapper[4628]: I1211 05:31:47.941399 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb39a2b9-4933-4004-97eb-669d61fade13" path="/var/lib/kubelet/pods/bb39a2b9-4933-4004-97eb-669d61fade13/volumes" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.002780 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.037657 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-dns-svc\") pod \"18c07bbf-1357-4d94-afb7-0620ac8222eb\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.037738 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-config\") pod \"18c07bbf-1357-4d94-afb7-0620ac8222eb\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.037768 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-ovsdbserver-sb\") pod \"18c07bbf-1357-4d94-afb7-0620ac8222eb\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.037789 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-ovsdbserver-nb\") pod \"18c07bbf-1357-4d94-afb7-0620ac8222eb\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.037822 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l92lx\" (UniqueName: \"kubernetes.io/projected/18c07bbf-1357-4d94-afb7-0620ac8222eb-kube-api-access-l92lx\") pod \"18c07bbf-1357-4d94-afb7-0620ac8222eb\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.037905 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-dns-swift-storage-0\") pod \"18c07bbf-1357-4d94-afb7-0620ac8222eb\" (UID: \"18c07bbf-1357-4d94-afb7-0620ac8222eb\") " Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.051035 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c07bbf-1357-4d94-afb7-0620ac8222eb-kube-api-access-l92lx" (OuterVolumeSpecName: "kube-api-access-l92lx") pod "18c07bbf-1357-4d94-afb7-0620ac8222eb" (UID: "18c07bbf-1357-4d94-afb7-0620ac8222eb"). InnerVolumeSpecName "kube-api-access-l92lx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.099433 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "18c07bbf-1357-4d94-afb7-0620ac8222eb" (UID: "18c07bbf-1357-4d94-afb7-0620ac8222eb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.132389 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-config" (OuterVolumeSpecName: "config") pod "18c07bbf-1357-4d94-afb7-0620ac8222eb" (UID: "18c07bbf-1357-4d94-afb7-0620ac8222eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.139981 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l92lx\" (UniqueName: \"kubernetes.io/projected/18c07bbf-1357-4d94-afb7-0620ac8222eb-kube-api-access-l92lx\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.140013 4628 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.140022 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.148786 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "18c07bbf-1357-4d94-afb7-0620ac8222eb" (UID: "18c07bbf-1357-4d94-afb7-0620ac8222eb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.154686 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "18c07bbf-1357-4d94-afb7-0620ac8222eb" (UID: "18c07bbf-1357-4d94-afb7-0620ac8222eb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.170754 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "18c07bbf-1357-4d94-afb7-0620ac8222eb" (UID: "18c07bbf-1357-4d94-afb7-0620ac8222eb"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.241744 4628 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.241806 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.241819 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18c07bbf-1357-4d94-afb7-0620ac8222eb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.520805 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-768bfb8b9-pbxmb"] Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.589613 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5bc894fc65-f69gf"] Dec 11 05:31:48 crc kubenswrapper[4628]: E1211 05:31:48.590028 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb39a2b9-4933-4004-97eb-669d61fade13" containerName="init" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.590043 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb39a2b9-4933-4004-97eb-669d61fade13" containerName="init" Dec 11 05:31:48 crc kubenswrapper[4628]: E1211 05:31:48.590092 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb39a2b9-4933-4004-97eb-669d61fade13" containerName="dnsmasq-dns" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.590099 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb39a2b9-4933-4004-97eb-669d61fade13" containerName="dnsmasq-dns" Dec 11 05:31:48 crc kubenswrapper[4628]: E1211 05:31:48.590112 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c07bbf-1357-4d94-afb7-0620ac8222eb" containerName="init" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.590117 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c07bbf-1357-4d94-afb7-0620ac8222eb" containerName="init" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.590337 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-58jzx" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.590364 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb39a2b9-4933-4004-97eb-669d61fade13" containerName="dnsmasq-dns" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.590388 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c07bbf-1357-4d94-afb7-0620ac8222eb" containerName="init" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.592018 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-58jzx" event={"ID":"18c07bbf-1357-4d94-afb7-0620ac8222eb","Type":"ContainerDied","Data":"35eff1a528c6e6154105d0fdab4625e837f79907f830d5b928ae36bfb4341252"} Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.592074 4628 scope.go:117] "RemoveContainer" containerID="0ece5ade3c64b6d0ac517add0a7a0900c09c98a16f3b6205ab40023ffbb43390" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.592232 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.607276 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" event={"ID":"061f7965-d09e-4f01-9ee8-06638befdf0c","Type":"ContainerStarted","Data":"f9f5853e3e8d41dccf4b7dd2da5f88496fb808528fa4f3aeac1360f1a200d080"} Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.609217 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.652048 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-logs\") pod \"horizon-5bc894fc65-f69gf\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.652113 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-scripts\") pod \"horizon-5bc894fc65-f69gf\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.652165 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-horizon-secret-key\") pod \"horizon-5bc894fc65-f69gf\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.652220 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-config-data\") pod \"horizon-5bc894fc65-f69gf\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.652239 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsrnd\" (UniqueName: \"kubernetes.io/projected/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-kube-api-access-qsrnd\") pod \"horizon-5bc894fc65-f69gf\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.665984 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.681540 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5bc894fc65-f69gf"] Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.693194 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" podStartSLOduration=3.693180312 podStartE2EDuration="3.693180312s" podCreationTimestamp="2025-12-11 05:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:31:48.687308141 +0000 UTC m=+1011.104654839" watchObservedRunningTime="2025-12-11 05:31:48.693180312 +0000 UTC m=+1011.110527010" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.747101 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5959f8865f-58jzx"] Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.753241 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-config-data\") pod \"horizon-5bc894fc65-f69gf\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.753271 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsrnd\" (UniqueName: \"kubernetes.io/projected/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-kube-api-access-qsrnd\") pod \"horizon-5bc894fc65-f69gf\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.753348 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-logs\") pod \"horizon-5bc894fc65-f69gf\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.753374 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-scripts\") pod \"horizon-5bc894fc65-f69gf\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.753407 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-horizon-secret-key\") pod \"horizon-5bc894fc65-f69gf\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.755990 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-scripts\") pod \"horizon-5bc894fc65-f69gf\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.756454 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-logs\") pod \"horizon-5bc894fc65-f69gf\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.757964 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-config-data\") pod \"horizon-5bc894fc65-f69gf\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.766675 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-horizon-secret-key\") pod \"horizon-5bc894fc65-f69gf\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.769558 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-58jzx"] 
Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.803035 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsrnd\" (UniqueName: \"kubernetes.io/projected/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-kube-api-access-qsrnd\") pod \"horizon-5bc894fc65-f69gf\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:31:48 crc kubenswrapper[4628]: I1211 05:31:48.929234 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:31:49 crc kubenswrapper[4628]: I1211 05:31:49.498398 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5bc894fc65-f69gf"] Dec 11 05:31:49 crc kubenswrapper[4628]: I1211 05:31:49.634540 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bc894fc65-f69gf" event={"ID":"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7","Type":"ContainerStarted","Data":"e86dbf6da52e7e157bab6fb64585eeef0f668da0fe9d19a59048b287898d1663"} Dec 11 05:31:49 crc kubenswrapper[4628]: I1211 05:31:49.910329 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18c07bbf-1357-4d94-afb7-0620ac8222eb" path="/var/lib/kubelet/pods/18c07bbf-1357-4d94-afb7-0620ac8222eb/volumes" Dec 11 05:31:50 crc kubenswrapper[4628]: I1211 05:31:50.678928 4628 generic.go:334] "Generic (PLEG): container finished" podID="98c407a5-95e8-4036-becd-3522286435d5" containerID="f5cb61ff69c4cc2f3f37f09a5f1c204ec02775c42134b14406f6a1af785698b2" exitCode=0 Dec 11 05:31:50 crc kubenswrapper[4628]: I1211 05:31:50.680227 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ksb9p" event={"ID":"98c407a5-95e8-4036-becd-3522286435d5","Type":"ContainerDied","Data":"f5cb61ff69c4cc2f3f37f09a5f1c204ec02775c42134b14406f6a1af785698b2"} Dec 11 05:31:50 crc kubenswrapper[4628]: I1211 05:31:50.850071 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-764c5664d7-v5zrj" podUID="bb39a2b9-4933-4004-97eb-669d61fade13" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Dec 11 05:31:53 crc kubenswrapper[4628]: I1211 05:31:52.698595 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lqk5g" event={"ID":"0b98d4b7-cda3-44ff-87b6-8515b4ea082a","Type":"ContainerDied","Data":"b02b72218dd222a768b93cfb0a78581959a67e389b45a4665d82c9724a3f6fca"} Dec 11 05:31:53 crc kubenswrapper[4628]: I1211 05:31:52.698650 4628 generic.go:334] "Generic (PLEG): container finished" podID="0b98d4b7-cda3-44ff-87b6-8515b4ea082a" containerID="b02b72218dd222a768b93cfb0a78581959a67e389b45a4665d82c9724a3f6fca" exitCode=0 Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.406609 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f8668ff77-t645d"] Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.449488 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-66bdd9d8cd-mgd96"] Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.453656 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.456299 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.469269 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66bdd9d8cd-mgd96"] Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.496648 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d5ww\" (UniqueName: \"kubernetes.io/projected/8a3522a5-42e8-46ba-b794-d23582baa2a4-kube-api-access-8d5ww\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.496748 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a3522a5-42e8-46ba-b794-d23582baa2a4-scripts\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.496780 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3522a5-42e8-46ba-b794-d23582baa2a4-combined-ca-bundle\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.496843 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a3522a5-42e8-46ba-b794-d23582baa2a4-config-data\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.496890 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3522a5-42e8-46ba-b794-d23582baa2a4-horizon-tls-certs\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.497063 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a3522a5-42e8-46ba-b794-d23582baa2a4-horizon-secret-key\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.497172 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3522a5-42e8-46ba-b794-d23582baa2a4-logs\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.543745 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5bc894fc65-f69gf"] Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.575768 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7989644c86-scmh4"] Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.582375 
4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7989644c86-scmh4"] Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.582481 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.602871 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a3522a5-42e8-46ba-b794-d23582baa2a4-horizon-secret-key\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.602956 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3522a5-42e8-46ba-b794-d23582baa2a4-logs\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.602997 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d5ww\" (UniqueName: \"kubernetes.io/projected/8a3522a5-42e8-46ba-b794-d23582baa2a4-kube-api-access-8d5ww\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.603025 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a3522a5-42e8-46ba-b794-d23582baa2a4-scripts\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.603044 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3522a5-42e8-46ba-b794-d23582baa2a4-combined-ca-bundle\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.603611 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3522a5-42e8-46ba-b794-d23582baa2a4-logs\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.603696 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a3522a5-42e8-46ba-b794-d23582baa2a4-config-data\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.603720 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3522a5-42e8-46ba-b794-d23582baa2a4-horizon-tls-certs\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.606468 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a3522a5-42e8-46ba-b794-d23582baa2a4-scripts\") pod \"horizon-66bdd9d8cd-mgd96\" 
(UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.607304 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a3522a5-42e8-46ba-b794-d23582baa2a4-config-data\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.608684 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3522a5-42e8-46ba-b794-d23582baa2a4-combined-ca-bundle\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.611243 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a3522a5-42e8-46ba-b794-d23582baa2a4-horizon-secret-key\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.617963 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3522a5-42e8-46ba-b794-d23582baa2a4-horizon-tls-certs\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.705139 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51e02694-e634-4a3b-8406-3b3b72007c2b-config-data\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.705187 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9tl9\" (UniqueName: \"kubernetes.io/projected/51e02694-e634-4a3b-8406-3b3b72007c2b-kube-api-access-v9tl9\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.705239 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/51e02694-e634-4a3b-8406-3b3b72007c2b-horizon-tls-certs\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.705281 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/51e02694-e634-4a3b-8406-3b3b72007c2b-horizon-secret-key\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.705331 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51e02694-e634-4a3b-8406-3b3b72007c2b-combined-ca-bundle\") pod \"horizon-7989644c86-scmh4\" (UID: 
\"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.705391 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51e02694-e634-4a3b-8406-3b3b72007c2b-scripts\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.705428 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51e02694-e634-4a3b-8406-3b3b72007c2b-logs\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.809269 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51e02694-e634-4a3b-8406-3b3b72007c2b-scripts\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.809497 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51e02694-e634-4a3b-8406-3b3b72007c2b-logs\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.809566 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51e02694-e634-4a3b-8406-3b3b72007c2b-config-data\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.809599 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9tl9\" (UniqueName: \"kubernetes.io/projected/51e02694-e634-4a3b-8406-3b3b72007c2b-kube-api-access-v9tl9\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.809639 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/51e02694-e634-4a3b-8406-3b3b72007c2b-horizon-tls-certs\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.809658 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/51e02694-e634-4a3b-8406-3b3b72007c2b-horizon-secret-key\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.810529 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51e02694-e634-4a3b-8406-3b3b72007c2b-scripts\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.809896 4628 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51e02694-e634-4a3b-8406-3b3b72007c2b-combined-ca-bundle\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.811275 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51e02694-e634-4a3b-8406-3b3b72007c2b-logs\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.812502 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51e02694-e634-4a3b-8406-3b3b72007c2b-config-data\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.813765 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/51e02694-e634-4a3b-8406-3b3b72007c2b-horizon-secret-key\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.816545 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51e02694-e634-4a3b-8406-3b3b72007c2b-combined-ca-bundle\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.820151 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/51e02694-e634-4a3b-8406-3b3b72007c2b-horizon-tls-certs\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:54 crc kubenswrapper[4628]: I1211 05:31:54.990686 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d5ww\" (UniqueName: \"kubernetes.io/projected/8a3522a5-42e8-46ba-b794-d23582baa2a4-kube-api-access-8d5ww\") pod \"horizon-66bdd9d8cd-mgd96\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:55 crc kubenswrapper[4628]: I1211 05:31:55.015251 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9tl9\" (UniqueName: \"kubernetes.io/projected/51e02694-e634-4a3b-8406-3b3b72007c2b-kube-api-access-v9tl9\") pod \"horizon-7989644c86-scmh4\" (UID: \"51e02694-e634-4a3b-8406-3b3b72007c2b\") " pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:55 crc kubenswrapper[4628]: I1211 05:31:55.092361 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:31:55 crc kubenswrapper[4628]: I1211 05:31:55.095581 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:31:55 crc kubenswrapper[4628]: I1211 05:31:55.811068 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:31:55 crc kubenswrapper[4628]: I1211 05:31:55.867801 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-cdwch"] Dec 11 05:31:55 crc kubenswrapper[4628]: I1211 05:31:55.868063 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-cdwch" podUID="2f6eacbe-bd53-4695-9411-efe751202c1b" containerName="dnsmasq-dns" containerID="cri-o://f14bf99d128610949a878fd7fd299ac9e0d58370b422b1522bfa2f6179a73b35" gracePeriod=10 Dec 11 05:31:58 crc kubenswrapper[4628]: I1211 05:31:58.755936 4628 generic.go:334] "Generic (PLEG): container finished" podID="2f6eacbe-bd53-4695-9411-efe751202c1b" containerID="f14bf99d128610949a878fd7fd299ac9e0d58370b422b1522bfa2f6179a73b35" exitCode=0 Dec 11 05:31:58 crc kubenswrapper[4628]: I1211 05:31:58.756039 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-cdwch" event={"ID":"2f6eacbe-bd53-4695-9411-efe751202c1b","Type":"ContainerDied","Data":"f14bf99d128610949a878fd7fd299ac9e0d58370b422b1522bfa2f6179a73b35"} Dec 11 05:32:00 crc kubenswrapper[4628]: I1211 05:32:00.872419 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-cdwch" podUID="2f6eacbe-bd53-4695-9411-efe751202c1b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Dec 11 05:32:00 crc kubenswrapper[4628]: I1211 05:32:00.920831 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ksb9p" Dec 11 05:32:01 crc kubenswrapper[4628]: I1211 05:32:01.011823 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c407a5-95e8-4036-becd-3522286435d5-combined-ca-bundle\") pod \"98c407a5-95e8-4036-becd-3522286435d5\" (UID: \"98c407a5-95e8-4036-becd-3522286435d5\") " Dec 11 05:32:01 crc kubenswrapper[4628]: I1211 05:32:01.012063 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98c407a5-95e8-4036-becd-3522286435d5-config-data\") pod \"98c407a5-95e8-4036-becd-3522286435d5\" (UID: \"98c407a5-95e8-4036-becd-3522286435d5\") " Dec 11 05:32:01 crc kubenswrapper[4628]: I1211 05:32:01.012252 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fjdz\" (UniqueName: \"kubernetes.io/projected/98c407a5-95e8-4036-becd-3522286435d5-kube-api-access-2fjdz\") pod \"98c407a5-95e8-4036-becd-3522286435d5\" (UID: \"98c407a5-95e8-4036-becd-3522286435d5\") " Dec 11 05:32:01 crc kubenswrapper[4628]: I1211 05:32:01.012453 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98c407a5-95e8-4036-becd-3522286435d5-db-sync-config-data\") pod \"98c407a5-95e8-4036-becd-3522286435d5\" (UID: \"98c407a5-95e8-4036-becd-3522286435d5\") " Dec 11 05:32:01 crc kubenswrapper[4628]: I1211 05:32:01.031020 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c407a5-95e8-4036-becd-3522286435d5-kube-api-access-2fjdz" (OuterVolumeSpecName: "kube-api-access-2fjdz") pod "98c407a5-95e8-4036-becd-3522286435d5" (UID: "98c407a5-95e8-4036-becd-3522286435d5"). InnerVolumeSpecName "kube-api-access-2fjdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:32:01 crc kubenswrapper[4628]: I1211 05:32:01.032746 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c407a5-95e8-4036-becd-3522286435d5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "98c407a5-95e8-4036-becd-3522286435d5" (UID: "98c407a5-95e8-4036-becd-3522286435d5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:01 crc kubenswrapper[4628]: I1211 05:32:01.064592 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c407a5-95e8-4036-becd-3522286435d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98c407a5-95e8-4036-becd-3522286435d5" (UID: "98c407a5-95e8-4036-becd-3522286435d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:01 crc kubenswrapper[4628]: I1211 05:32:01.078673 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c407a5-95e8-4036-becd-3522286435d5-config-data" (OuterVolumeSpecName: "config-data") pod "98c407a5-95e8-4036-becd-3522286435d5" (UID: "98c407a5-95e8-4036-becd-3522286435d5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:01 crc kubenswrapper[4628]: I1211 05:32:01.115202 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98c407a5-95e8-4036-becd-3522286435d5-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:01 crc kubenswrapper[4628]: I1211 05:32:01.115239 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fjdz\" (UniqueName: \"kubernetes.io/projected/98c407a5-95e8-4036-becd-3522286435d5-kube-api-access-2fjdz\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:01 crc kubenswrapper[4628]: I1211 05:32:01.115251 4628 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/98c407a5-95e8-4036-becd-3522286435d5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:01 crc kubenswrapper[4628]: I1211 05:32:01.115261 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c407a5-95e8-4036-becd-3522286435d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:01 crc kubenswrapper[4628]: I1211 05:32:01.791470 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ksb9p" event={"ID":"98c407a5-95e8-4036-becd-3522286435d5","Type":"ContainerDied","Data":"b1a91a3e8c4f23beca2e685d8f0c55b214db6311f0fd290742cb8867f6d2e615"} Dec 11 05:32:01 crc kubenswrapper[4628]: I1211 05:32:01.792341 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1a91a3e8c4f23beca2e685d8f0c55b214db6311f0fd290742cb8867f6d2e615" Dec 11 05:32:01 crc kubenswrapper[4628]: I1211 05:32:01.792248 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ksb9p" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.382831 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-xgptg"] Dec 11 05:32:02 crc kubenswrapper[4628]: E1211 05:32:02.383479 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c407a5-95e8-4036-becd-3522286435d5" containerName="glance-db-sync" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.383492 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c407a5-95e8-4036-becd-3522286435d5" containerName="glance-db-sync" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.383664 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c407a5-95e8-4036-becd-3522286435d5" containerName="glance-db-sync" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.384477 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.426933 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-xgptg"] Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.475614 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-xgptg\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.475674 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-xgptg\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.475698 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lwxf\" (UniqueName: \"kubernetes.io/projected/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-kube-api-access-8lwxf\") pod \"dnsmasq-dns-785d8bcb8c-xgptg\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.475717 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-xgptg\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.475740 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-config\") pod \"dnsmasq-dns-785d8bcb8c-xgptg\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.475807 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-xgptg\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.576966 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lwxf\" (UniqueName: \"kubernetes.io/projected/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-kube-api-access-8lwxf\") pod \"dnsmasq-dns-785d8bcb8c-xgptg\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.577017 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-xgptg\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.577049 4628 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-config\") pod \"dnsmasq-dns-785d8bcb8c-xgptg\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.577126 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-xgptg\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.577213 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-xgptg\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.577243 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-xgptg\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.577885 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-xgptg\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.577955 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-xgptg\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.578078 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-xgptg\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.578503 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-xgptg\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.578857 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-config\") pod \"dnsmasq-dns-785d8bcb8c-xgptg\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.641007 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lwxf\" (UniqueName: 
\"kubernetes.io/projected/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-kube-api-access-8lwxf\") pod \"dnsmasq-dns-785d8bcb8c-xgptg\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:02 crc kubenswrapper[4628]: I1211 05:32:02.709299 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.413120 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.416989 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.422392 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-92zmr" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.422818 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.423801 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.431138 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.492222 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bb5497e-604f-4176-b214-0343c94e89c2-logs\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.492291 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbbp9\" (UniqueName: \"kubernetes.io/projected/6bb5497e-604f-4176-b214-0343c94e89c2-kube-api-access-pbbp9\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.492413 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bb5497e-604f-4176-b214-0343c94e89c2-scripts\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.492504 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb5497e-604f-4176-b214-0343c94e89c2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.492692 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bb5497e-604f-4176-b214-0343c94e89c2-config-data\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.492790 4628 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6bb5497e-604f-4176-b214-0343c94e89c2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.492994 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.559010 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.560626 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.565491 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.577480 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.594472 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bb5497e-604f-4176-b214-0343c94e89c2-config-data\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.594541 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6bb5497e-604f-4176-b214-0343c94e89c2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.594602 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.594622 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bb5497e-604f-4176-b214-0343c94e89c2-logs\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.594648 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbbp9\" (UniqueName: \"kubernetes.io/projected/6bb5497e-604f-4176-b214-0343c94e89c2-kube-api-access-pbbp9\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.594691 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bb5497e-604f-4176-b214-0343c94e89c2-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.594719 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb5497e-604f-4176-b214-0343c94e89c2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.595800 4628 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.595877 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6bb5497e-604f-4176-b214-0343c94e89c2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.595906 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bb5497e-604f-4176-b214-0343c94e89c2-logs\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.601105 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb5497e-604f-4176-b214-0343c94e89c2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.604839 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bb5497e-604f-4176-b214-0343c94e89c2-config-data\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.605744 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bb5497e-604f-4176-b214-0343c94e89c2-scripts\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.621631 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbbp9\" (UniqueName: \"kubernetes.io/projected/6bb5497e-604f-4176-b214-0343c94e89c2-kube-api-access-pbbp9\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.633515 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " pod="openstack/glance-default-external-api-0" 
Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.696533 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.696901 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.697033 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.697102 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.697204 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-logs\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.697350 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jfht\" (UniqueName: \"kubernetes.io/projected/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-kube-api-access-6jfht\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.697574 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.749694 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.799228 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jfht\" (UniqueName: \"kubernetes.io/projected/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-kube-api-access-6jfht\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.799535 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.799731 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.800020 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.800406 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.800497 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.800608 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.800678 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-logs\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.801018 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-logs\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.803254 4628 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.805838 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.810879 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.811746 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.823034 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jfht\" (UniqueName: \"kubernetes.io/projected/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-kube-api-access-6jfht\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.832268 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:03 crc kubenswrapper[4628]: I1211 05:32:03.877252 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 11 05:32:04 crc kubenswrapper[4628]: I1211 05:32:04.857071 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 05:32:04 crc kubenswrapper[4628]: I1211 05:32:04.940340 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 05:32:05 crc kubenswrapper[4628]: I1211 05:32:05.871942 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-cdwch" podUID="2f6eacbe-bd53-4695-9411-efe751202c1b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Dec 11 05:32:08 crc kubenswrapper[4628]: E1211 05:32:08.307208 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 11 05:32:08 crc kubenswrapper[4628]: E1211 05:32:08.307775 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5b5h57chf6h57fh569h678h647h66ch67dh654h5bh559h5ffhcbh68fhbbhbh5dfh697h77h558h686h5b9h5f7h55dh5b4h664h698h558h5cfh5ddh674q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lxsxm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-f8668ff77-t645d_openstack(cfadba6b-22f6-471b-b328-c17e9b74e67a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:32:08 crc kubenswrapper[4628]: E1211 05:32:08.322260 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-f8668ff77-t645d" podUID="cfadba6b-22f6-471b-b328-c17e9b74e67a" Dec 11 05:32:08 crc kubenswrapper[4628]: E1211 05:32:08.343222 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 11 05:32:08 crc kubenswrapper[4628]: E1211 05:32:08.343405 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n658h5bdh7ch5b6h5c7h5f5h99h57fh7ch67fh647hc8h67bh7hf9h58bh8bhd7h5c4h66dhdbh7bhcfhdh696hd7h579hf5h57dh68dh7dh9dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9lcxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-768bfb8b9-pbxmb_openstack(2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:32:08 crc kubenswrapper[4628]: E1211 05:32:08.348101 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-768bfb8b9-pbxmb" podUID="2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1" Dec 11 05:32:10 crc kubenswrapper[4628]: I1211 05:32:10.871820 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-cdwch" podUID="2f6eacbe-bd53-4695-9411-efe751202c1b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Dec 11 05:32:10 crc 
kubenswrapper[4628]: I1211 05:32:10.872172 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:32:11 crc kubenswrapper[4628]: E1211 05:32:11.154375 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 11 05:32:11 crc kubenswrapper[4628]: E1211 05:32:11.155700 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n65h7h684hbbh5b4h5f4hb6h554h95h554h545h5cch696h5b4h5f6h647h657h569h6ch67dh8chddh5hc7h589hdch59h7fh68bh5cbh549h697q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsrnd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5bc894fc65-f69gf_openstack(7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:32:11 crc kubenswrapper[4628]: E1211 05:32:11.159574 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5bc894fc65-f69gf" podUID="7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7" Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.231309 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.351564 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ns64\" (UniqueName: \"kubernetes.io/projected/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-kube-api-access-6ns64\") pod \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.351636 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-credential-keys\") pod \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.351741 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-scripts\") pod \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.351760 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-fernet-keys\") pod \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.351778 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-config-data\") pod \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.351810 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-combined-ca-bundle\") pod \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\" (UID: \"0b98d4b7-cda3-44ff-87b6-8515b4ea082a\") " Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.370658 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0b98d4b7-cda3-44ff-87b6-8515b4ea082a" (UID: "0b98d4b7-cda3-44ff-87b6-8515b4ea082a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.377959 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-scripts" (OuterVolumeSpecName: "scripts") pod "0b98d4b7-cda3-44ff-87b6-8515b4ea082a" (UID: "0b98d4b7-cda3-44ff-87b6-8515b4ea082a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.394139 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-kube-api-access-6ns64" (OuterVolumeSpecName: "kube-api-access-6ns64") pod "0b98d4b7-cda3-44ff-87b6-8515b4ea082a" (UID: "0b98d4b7-cda3-44ff-87b6-8515b4ea082a"). InnerVolumeSpecName "kube-api-access-6ns64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.413140 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0b98d4b7-cda3-44ff-87b6-8515b4ea082a" (UID: "0b98d4b7-cda3-44ff-87b6-8515b4ea082a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.431251 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b98d4b7-cda3-44ff-87b6-8515b4ea082a" (UID: "0b98d4b7-cda3-44ff-87b6-8515b4ea082a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.432064 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-config-data" (OuterVolumeSpecName: "config-data") pod "0b98d4b7-cda3-44ff-87b6-8515b4ea082a" (UID: "0b98d4b7-cda3-44ff-87b6-8515b4ea082a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.453888 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ns64\" (UniqueName: \"kubernetes.io/projected/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-kube-api-access-6ns64\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.453921 4628 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.453931 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.453939 4628 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.453948 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.453957 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b98d4b7-cda3-44ff-87b6-8515b4ea082a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.890601 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lqk5g" Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.890671 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lqk5g" event={"ID":"0b98d4b7-cda3-44ff-87b6-8515b4ea082a","Type":"ContainerDied","Data":"6ff4b767e036c0b4ca2e5299c6a1a6a73ff8541f2efc0a0d82eb8fc118fd423f"} Dec 11 05:32:11 crc kubenswrapper[4628]: I1211 05:32:11.891568 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ff4b767e036c0b4ca2e5299c6a1a6a73ff8541f2efc0a0d82eb8fc118fd423f" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.329489 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-lqk5g"] Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.342279 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-lqk5g"] Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.421047 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-79fmw"] Dec 11 05:32:12 crc kubenswrapper[4628]: E1211 05:32:12.421465 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b98d4b7-cda3-44ff-87b6-8515b4ea082a" containerName="keystone-bootstrap" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.421479 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b98d4b7-cda3-44ff-87b6-8515b4ea082a" containerName="keystone-bootstrap" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.421674 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b98d4b7-cda3-44ff-87b6-8515b4ea082a" containerName="keystone-bootstrap" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.422320 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.425262 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.425476 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.425537 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.425560 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-58h9w" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.425588 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.430267 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-79fmw"] Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.475144 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-credential-keys\") pod \"keystone-bootstrap-79fmw\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.475186 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-fernet-keys\") pod \"keystone-bootstrap-79fmw\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.475210 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-scripts\") pod \"keystone-bootstrap-79fmw\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.475246 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-config-data\") pod \"keystone-bootstrap-79fmw\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.475260 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mrvg\" (UniqueName: \"kubernetes.io/projected/90c5df18-e257-4561-8148-8cebd4644e40-kube-api-access-6mrvg\") pod \"keystone-bootstrap-79fmw\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.475288 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-combined-ca-bundle\") pod \"keystone-bootstrap-79fmw\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.577369 4628 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-credential-keys\") pod \"keystone-bootstrap-79fmw\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.577430 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-fernet-keys\") pod \"keystone-bootstrap-79fmw\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.577460 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-scripts\") pod \"keystone-bootstrap-79fmw\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.577481 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-config-data\") pod \"keystone-bootstrap-79fmw\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.577513 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mrvg\" (UniqueName: \"kubernetes.io/projected/90c5df18-e257-4561-8148-8cebd4644e40-kube-api-access-6mrvg\") pod \"keystone-bootstrap-79fmw\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.577547 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-combined-ca-bundle\") pod \"keystone-bootstrap-79fmw\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.583485 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-combined-ca-bundle\") pod \"keystone-bootstrap-79fmw\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.583642 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-credential-keys\") pod \"keystone-bootstrap-79fmw\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.584743 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-config-data\") pod \"keystone-bootstrap-79fmw\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.589551 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-fernet-keys\") pod \"keystone-bootstrap-79fmw\" (UID: 
\"90c5df18-e257-4561-8148-8cebd4644e40\") " pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.589795 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-scripts\") pod \"keystone-bootstrap-79fmw\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.594674 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mrvg\" (UniqueName: \"kubernetes.io/projected/90c5df18-e257-4561-8148-8cebd4644e40-kube-api-access-6mrvg\") pod \"keystone-bootstrap-79fmw\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:12 crc kubenswrapper[4628]: I1211 05:32:12.741471 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:13 crc kubenswrapper[4628]: I1211 05:32:13.900365 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b98d4b7-cda3-44ff-87b6-8515b4ea082a" path="/var/lib/kubelet/pods/0b98d4b7-cda3-44ff-87b6-8515b4ea082a/volumes" Dec 11 05:32:16 crc kubenswrapper[4628]: I1211 05:32:16.970354 4628 generic.go:334] "Generic (PLEG): container finished" podID="8c53cf2b-ce22-43f3-88fa-4a91ea4131bc" containerID="fdce9474a37802f20d7909fc13787fe95fe63b8a57ba9c19707a22b5c49b3a81" exitCode=0 Dec 11 05:32:16 crc kubenswrapper[4628]: I1211 05:32:16.970618 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zvwzz" event={"ID":"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc","Type":"ContainerDied","Data":"fdce9474a37802f20d7909fc13787fe95fe63b8a57ba9c19707a22b5c49b3a81"} Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.162979 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.176757 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.239485 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cfadba6b-22f6-471b-b328-c17e9b74e67a-config-data\") pod \"cfadba6b-22f6-471b-b328-c17e9b74e67a\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.239547 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfadba6b-22f6-471b-b328-c17e9b74e67a-scripts\") pod \"cfadba6b-22f6-471b-b328-c17e9b74e67a\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.239608 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxsxm\" (UniqueName: \"kubernetes.io/projected/cfadba6b-22f6-471b-b328-c17e9b74e67a-kube-api-access-lxsxm\") pod \"cfadba6b-22f6-471b-b328-c17e9b74e67a\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.239984 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfadba6b-22f6-471b-b328-c17e9b74e67a-scripts" (OuterVolumeSpecName: "scripts") pod "cfadba6b-22f6-471b-b328-c17e9b74e67a" (UID: "cfadba6b-22f6-471b-b328-c17e9b74e67a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.240090 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfadba6b-22f6-471b-b328-c17e9b74e67a-config-data" (OuterVolumeSpecName: "config-data") pod "cfadba6b-22f6-471b-b328-c17e9b74e67a" (UID: "cfadba6b-22f6-471b-b328-c17e9b74e67a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.240632 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-logs" (OuterVolumeSpecName: "logs") pod "2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1" (UID: "2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.240063 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-logs\") pod \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.241003 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cfadba6b-22f6-471b-b328-c17e9b74e67a-horizon-secret-key\") pod \"cfadba6b-22f6-471b-b328-c17e9b74e67a\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.241041 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-horizon-secret-key\") pod \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.241071 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfadba6b-22f6-471b-b328-c17e9b74e67a-logs\") pod \"cfadba6b-22f6-471b-b328-c17e9b74e67a\" (UID: \"cfadba6b-22f6-471b-b328-c17e9b74e67a\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.241094 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-config-data\") pod \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.241235 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-scripts\") pod \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.241271 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lcxk\" (UniqueName: \"kubernetes.io/projected/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-kube-api-access-9lcxk\") pod \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\" (UID: \"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.242218 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-config-data" (OuterVolumeSpecName: "config-data") pod "2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1" (UID: "2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.242299 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-scripts" (OuterVolumeSpecName: "scripts") pod "2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1" (UID: "2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.242427 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cfadba6b-22f6-471b-b328-c17e9b74e67a-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.242450 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cfadba6b-22f6-471b-b328-c17e9b74e67a-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.242460 4628 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-logs\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.242789 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfadba6b-22f6-471b-b328-c17e9b74e67a-logs" (OuterVolumeSpecName: "logs") pod "cfadba6b-22f6-471b-b328-c17e9b74e67a" (UID: "cfadba6b-22f6-471b-b328-c17e9b74e67a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.245713 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfadba6b-22f6-471b-b328-c17e9b74e67a-kube-api-access-lxsxm" (OuterVolumeSpecName: "kube-api-access-lxsxm") pod "cfadba6b-22f6-471b-b328-c17e9b74e67a" (UID: "cfadba6b-22f6-471b-b328-c17e9b74e67a"). InnerVolumeSpecName "kube-api-access-lxsxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.249796 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfadba6b-22f6-471b-b328-c17e9b74e67a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "cfadba6b-22f6-471b-b328-c17e9b74e67a" (UID: "cfadba6b-22f6-471b-b328-c17e9b74e67a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.250796 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1" (UID: "2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.254457 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-kube-api-access-9lcxk" (OuterVolumeSpecName: "kube-api-access-9lcxk") pod "2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1" (UID: "2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1"). InnerVolumeSpecName "kube-api-access-9lcxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.343703 4628 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cfadba6b-22f6-471b-b328-c17e9b74e67a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.343730 4628 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.343740 4628 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfadba6b-22f6-471b-b328-c17e9b74e67a-logs\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.343749 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.343757 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.343765 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lcxk\" (UniqueName: \"kubernetes.io/projected/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1-kube-api-access-9lcxk\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.343774 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxsxm\" (UniqueName: \"kubernetes.io/projected/cfadba6b-22f6-471b-b328-c17e9b74e67a-kube-api-access-lxsxm\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:20 crc kubenswrapper[4628]: E1211 05:32:20.752356 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 11 05:32:20 crc kubenswrapper[4628]: E1211 05:32:20.752564 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n56chf5hfh5bbh658h575h5dbh64ch5b8h679h9dh567h696h664h557hf4h5d9h57fh74h55ch5f6h5bch5b4h64fh586h56fh687h5b5h4h578h69hcdq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-njt7h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(48b1c132-b854-4494-9e51-d934e9946366): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.827047 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.829170 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.837271 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-zvwzz" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.908069 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-config-data\") pod \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.908133 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-horizon-secret-key\") pod \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.908213 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-dns-svc\") pod \"2f6eacbe-bd53-4695-9411-efe751202c1b\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.908250 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-ovsdbserver-sb\") pod \"2f6eacbe-bd53-4695-9411-efe751202c1b\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.908277 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-ovsdbserver-nb\") pod \"2f6eacbe-bd53-4695-9411-efe751202c1b\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.908303 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxqj4\" (UniqueName: \"kubernetes.io/projected/2f6eacbe-bd53-4695-9411-efe751202c1b-kube-api-access-kxqj4\") pod \"2f6eacbe-bd53-4695-9411-efe751202c1b\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.908328 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-config\") pod \"2f6eacbe-bd53-4695-9411-efe751202c1b\" (UID: \"2f6eacbe-bd53-4695-9411-efe751202c1b\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.908375 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-logs\") pod \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.908427 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-scripts\") pod \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\" (UID: \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.908471 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsrnd\" (UniqueName: \"kubernetes.io/projected/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-kube-api-access-qsrnd\") pod \"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\" (UID: 
\"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7\") " Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.914432 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-kube-api-access-qsrnd" (OuterVolumeSpecName: "kube-api-access-qsrnd") pod "7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7" (UID: "7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7"). InnerVolumeSpecName "kube-api-access-qsrnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.920436 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-cdwch" podUID="2f6eacbe-bd53-4695-9411-efe751202c1b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: i/o timeout" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.922314 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-logs" (OuterVolumeSpecName: "logs") pod "7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7" (UID: "7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.923102 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-scripts" (OuterVolumeSpecName: "scripts") pod "7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7" (UID: "7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.923232 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f6eacbe-bd53-4695-9411-efe751202c1b-kube-api-access-kxqj4" (OuterVolumeSpecName: "kube-api-access-kxqj4") pod "2f6eacbe-bd53-4695-9411-efe751202c1b" (UID: "2f6eacbe-bd53-4695-9411-efe751202c1b"). InnerVolumeSpecName "kube-api-access-kxqj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.924350 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-config-data" (OuterVolumeSpecName: "config-data") pod "7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7" (UID: "7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.927348 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7" (UID: "7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.983942 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-config" (OuterVolumeSpecName: "config") pod "2f6eacbe-bd53-4695-9411-efe751202c1b" (UID: "2f6eacbe-bd53-4695-9411-efe751202c1b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:20 crc kubenswrapper[4628]: I1211 05:32:20.995237 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2f6eacbe-bd53-4695-9411-efe751202c1b" (UID: "2f6eacbe-bd53-4695-9411-efe751202c1b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.008099 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2f6eacbe-bd53-4695-9411-efe751202c1b" (UID: "2f6eacbe-bd53-4695-9411-efe751202c1b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.009292 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5bc894fc65-f69gf" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.009288 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bc894fc65-f69gf" event={"ID":"7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7","Type":"ContainerDied","Data":"e86dbf6da52e7e157bab6fb64585eeef0f668da0fe9d19a59048b287898d1663"} Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.010067 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc-combined-ca-bundle\") pod \"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc\" (UID: \"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc\") " Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.010145 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc-config\") pod \"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc\" (UID: \"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc\") " Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.010354 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2x4d\" (UniqueName: \"kubernetes.io/projected/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc-kube-api-access-k2x4d\") pod \"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc\" (UID: \"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc\") " Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.011017 4628 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.011035 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.011045 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.011075 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxqj4\" (UniqueName: \"kubernetes.io/projected/2f6eacbe-bd53-4695-9411-efe751202c1b-kube-api-access-kxqj4\") 
on node \"crc\" DevicePath \"\"" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.011086 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.011094 4628 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-logs\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.011130 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.011139 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsrnd\" (UniqueName: \"kubernetes.io/projected/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-kube-api-access-qsrnd\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.011147 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.012977 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-768bfb8b9-pbxmb" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.013079 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-768bfb8b9-pbxmb" event={"ID":"2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1","Type":"ContainerDied","Data":"c30d43104b6db752538dd0cd0bfa09e486ed29ddaf69a2c51740075058e44c77"} Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.014919 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc-kube-api-access-k2x4d" (OuterVolumeSpecName: "kube-api-access-k2x4d") pod "8c53cf2b-ce22-43f3-88fa-4a91ea4131bc" (UID: "8c53cf2b-ce22-43f3-88fa-4a91ea4131bc"). InnerVolumeSpecName "kube-api-access-k2x4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.020916 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2f6eacbe-bd53-4695-9411-efe751202c1b" (UID: "2f6eacbe-bd53-4695-9411-efe751202c1b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.023221 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-cdwch" event={"ID":"2f6eacbe-bd53-4695-9411-efe751202c1b","Type":"ContainerDied","Data":"7ccd27ea8702017b74583963f965a334b4f250fa1153e6d6bdc1e1c5490f1f2d"} Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.023284 4628 scope.go:117] "RemoveContainer" containerID="f14bf99d128610949a878fd7fd299ac9e0d58370b422b1522bfa2f6179a73b35" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.023445 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-cdwch" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.027694 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zvwzz" event={"ID":"8c53cf2b-ce22-43f3-88fa-4a91ea4131bc","Type":"ContainerDied","Data":"b82c9d502d882c9998cc9c3edaebf62d0ebc91265f44172a6c41c782cea2a718"} Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.027741 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b82c9d502d882c9998cc9c3edaebf62d0ebc91265f44172a6c41c782cea2a718" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.027795 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zvwzz" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.029951 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f8668ff77-t645d" event={"ID":"cfadba6b-22f6-471b-b328-c17e9b74e67a","Type":"ContainerDied","Data":"16f6fc1a2f5ed49172d23205a16df42b6e5209f2604c000fad10bccde6a967f0"} Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.029999 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f8668ff77-t645d" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.048115 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc-config" (OuterVolumeSpecName: "config") pod "8c53cf2b-ce22-43f3-88fa-4a91ea4131bc" (UID: "8c53cf2b-ce22-43f3-88fa-4a91ea4131bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.051026 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c53cf2b-ce22-43f3-88fa-4a91ea4131bc" (UID: "8c53cf2b-ce22-43f3-88fa-4a91ea4131bc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.113120 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2x4d\" (UniqueName: \"kubernetes.io/projected/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc-kube-api-access-k2x4d\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.113154 4628 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f6eacbe-bd53-4695-9411-efe751202c1b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.113165 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.113173 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.195668 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f8668ff77-t645d"] Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.220147 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-f8668ff77-t645d"] Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.227125 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-cdwch"] Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.256110 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-cdwch"] Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.296822 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5bc894fc65-f69gf"] Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.304787 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5bc894fc65-f69gf"] Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.317664 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-768bfb8b9-pbxmb"] Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.323419 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-768bfb8b9-pbxmb"] Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.898752 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1" path="/var/lib/kubelet/pods/2b5f207f-36c2-4ca8-a2e5-4ff7366b71a1/volumes" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.899137 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f6eacbe-bd53-4695-9411-efe751202c1b" path="/var/lib/kubelet/pods/2f6eacbe-bd53-4695-9411-efe751202c1b/volumes" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.900089 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7" path="/var/lib/kubelet/pods/7e484d5d-1b6c-4ab2-8e1b-b921e9dc9ce7/volumes" Dec 11 05:32:21 crc kubenswrapper[4628]: I1211 05:32:21.900490 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfadba6b-22f6-471b-b328-c17e9b74e67a" path="/var/lib/kubelet/pods/cfadba6b-22f6-471b-b328-c17e9b74e67a/volumes" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.111272 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-785d8bcb8c-xgptg"] Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.163586 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7f4cbc8496-gqwsj"] Dec 11 05:32:22 crc kubenswrapper[4628]: E1211 05:32:22.163979 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f6eacbe-bd53-4695-9411-efe751202c1b" containerName="init" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.163998 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f6eacbe-bd53-4695-9411-efe751202c1b" containerName="init" Dec 11 05:32:22 crc kubenswrapper[4628]: E1211 05:32:22.164012 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f6eacbe-bd53-4695-9411-efe751202c1b" containerName="dnsmasq-dns" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.164019 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f6eacbe-bd53-4695-9411-efe751202c1b" containerName="dnsmasq-dns" Dec 11 05:32:22 crc kubenswrapper[4628]: E1211 05:32:22.164038 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c53cf2b-ce22-43f3-88fa-4a91ea4131bc" containerName="neutron-db-sync" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.164044 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c53cf2b-ce22-43f3-88fa-4a91ea4131bc" containerName="neutron-db-sync" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.164200 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c53cf2b-ce22-43f3-88fa-4a91ea4131bc" containerName="neutron-db-sync" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.164218 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f6eacbe-bd53-4695-9411-efe751202c1b" containerName="dnsmasq-dns" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.165044 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.170609 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.170640 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.170806 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.170821 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-gz72p" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.188789 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-lh2sk"] Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.190474 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.205816 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f4cbc8496-gqwsj"] Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.228479 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-lh2sk"] Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.246330 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-combined-ca-bundle\") pod \"neutron-7f4cbc8496-gqwsj\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.246492 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvbpx\" (UniqueName: \"kubernetes.io/projected/32a1d94f-dcff-4648-8b3e-0b54cf493211-kube-api-access-cvbpx\") pod \"neutron-7f4cbc8496-gqwsj\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.246606 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-config\") pod \"neutron-7f4cbc8496-gqwsj\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.246657 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-ovndb-tls-certs\") pod \"neutron-7f4cbc8496-gqwsj\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.246779 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-httpd-config\") pod \"neutron-7f4cbc8496-gqwsj\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.348630 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-lh2sk\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.348693 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-config\") pod \"neutron-7f4cbc8496-gqwsj\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.348725 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztnpl\" (UniqueName: \"kubernetes.io/projected/36ef8b87-70d5-4803-a864-19cde1a04b87-kube-api-access-ztnpl\") pod \"dnsmasq-dns-55f844cf75-lh2sk\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " 
pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.348743 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-lh2sk\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.348833 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-ovndb-tls-certs\") pod \"neutron-7f4cbc8496-gqwsj\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.348900 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-lh2sk\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.348927 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-httpd-config\") pod \"neutron-7f4cbc8496-gqwsj\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.348951 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-config\") pod \"dnsmasq-dns-55f844cf75-lh2sk\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.348982 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-combined-ca-bundle\") pod \"neutron-7f4cbc8496-gqwsj\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.349000 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-dns-svc\") pod \"dnsmasq-dns-55f844cf75-lh2sk\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.349036 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvbpx\" (UniqueName: \"kubernetes.io/projected/32a1d94f-dcff-4648-8b3e-0b54cf493211-kube-api-access-cvbpx\") pod \"neutron-7f4cbc8496-gqwsj\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.354217 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-ovndb-tls-certs\") pod \"neutron-7f4cbc8496-gqwsj\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " 
pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.354217 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-combined-ca-bundle\") pod \"neutron-7f4cbc8496-gqwsj\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.354260 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-httpd-config\") pod \"neutron-7f4cbc8496-gqwsj\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.367423 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvbpx\" (UniqueName: \"kubernetes.io/projected/32a1d94f-dcff-4648-8b3e-0b54cf493211-kube-api-access-cvbpx\") pod \"neutron-7f4cbc8496-gqwsj\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.369400 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-config\") pod \"neutron-7f4cbc8496-gqwsj\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.450106 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-dns-svc\") pod \"dnsmasq-dns-55f844cf75-lh2sk\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.450197 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-lh2sk\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.450237 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztnpl\" (UniqueName: \"kubernetes.io/projected/36ef8b87-70d5-4803-a864-19cde1a04b87-kube-api-access-ztnpl\") pod \"dnsmasq-dns-55f844cf75-lh2sk\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.450256 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-lh2sk\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.450305 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-lh2sk\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.450341 4628 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-config\") pod \"dnsmasq-dns-55f844cf75-lh2sk\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.451054 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-dns-svc\") pod \"dnsmasq-dns-55f844cf75-lh2sk\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.451105 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-lh2sk\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.451165 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-config\") pod \"dnsmasq-dns-55f844cf75-lh2sk\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.451238 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-lh2sk\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.451599 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-lh2sk\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.468209 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztnpl\" (UniqueName: \"kubernetes.io/projected/36ef8b87-70d5-4803-a864-19cde1a04b87-kube-api-access-ztnpl\") pod \"dnsmasq-dns-55f844cf75-lh2sk\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.496117 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:22 crc kubenswrapper[4628]: I1211 05:32:22.511711 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:24 crc kubenswrapper[4628]: E1211 05:32:24.494099 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 11 05:32:24 crc kubenswrapper[4628]: E1211 05:32:24.494482 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hp4x5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-n2b6t_openstack(38627c48-4a86-4721-874d-8f386ea24495): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 05:32:24 crc kubenswrapper[4628]: E1211 05:32:24.495813 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-n2b6t" podUID="38627c48-4a86-4721-874d-8f386ea24495" Dec 11 05:32:24 crc kubenswrapper[4628]: I1211 05:32:24.570764 4628 scope.go:117] "RemoveContainer" 
containerID="37e5f8a8a3dce4af5c0d900bf0f8d9303fdecb84fd394e95b2b5df0760f89754" Dec 11 05:32:24 crc kubenswrapper[4628]: I1211 05:32:24.851425 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-67df497849-l9zzv"] Dec 11 05:32:24 crc kubenswrapper[4628]: I1211 05:32:24.876616 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:24 crc kubenswrapper[4628]: I1211 05:32:24.891980 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 11 05:32:24 crc kubenswrapper[4628]: I1211 05:32:24.892249 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 11 05:32:24 crc kubenswrapper[4628]: I1211 05:32:24.934818 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67df497849-l9zzv"] Dec 11 05:32:24 crc kubenswrapper[4628]: I1211 05:32:24.994507 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9d505505-13f5-4899-b1a5-7f739066e73c-config\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:24 crc kubenswrapper[4628]: I1211 05:32:24.994715 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d505505-13f5-4899-b1a5-7f739066e73c-ovndb-tls-certs\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:24 crc kubenswrapper[4628]: I1211 05:32:24.994784 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9d505505-13f5-4899-b1a5-7f739066e73c-httpd-config\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:24 crc kubenswrapper[4628]: I1211 05:32:24.994908 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d505505-13f5-4899-b1a5-7f739066e73c-public-tls-certs\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:24 crc kubenswrapper[4628]: I1211 05:32:24.994985 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d505505-13f5-4899-b1a5-7f739066e73c-internal-tls-certs\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:24 crc kubenswrapper[4628]: I1211 05:32:24.995070 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d505505-13f5-4899-b1a5-7f739066e73c-combined-ca-bundle\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:24 crc kubenswrapper[4628]: I1211 05:32:24.995155 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46rkh\" (UniqueName: 
\"kubernetes.io/projected/9d505505-13f5-4899-b1a5-7f739066e73c-kube-api-access-46rkh\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:25 crc kubenswrapper[4628]: E1211 05:32:25.086328 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-n2b6t" podUID="38627c48-4a86-4721-874d-8f386ea24495" Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.096301 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d505505-13f5-4899-b1a5-7f739066e73c-ovndb-tls-certs\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.096344 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9d505505-13f5-4899-b1a5-7f739066e73c-httpd-config\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.096373 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d505505-13f5-4899-b1a5-7f739066e73c-public-tls-certs\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.096395 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d505505-13f5-4899-b1a5-7f739066e73c-internal-tls-certs\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.096427 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d505505-13f5-4899-b1a5-7f739066e73c-combined-ca-bundle\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.096455 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46rkh\" (UniqueName: \"kubernetes.io/projected/9d505505-13f5-4899-b1a5-7f739066e73c-kube-api-access-46rkh\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.096507 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9d505505-13f5-4899-b1a5-7f739066e73c-config\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.104101 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9d505505-13f5-4899-b1a5-7f739066e73c-config\") pod \"neutron-67df497849-l9zzv\" (UID: 
\"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.114853 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d505505-13f5-4899-b1a5-7f739066e73c-combined-ca-bundle\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.118364 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d505505-13f5-4899-b1a5-7f739066e73c-public-tls-certs\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.118825 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d505505-13f5-4899-b1a5-7f739066e73c-internal-tls-certs\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.119472 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9d505505-13f5-4899-b1a5-7f739066e73c-httpd-config\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.127419 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d505505-13f5-4899-b1a5-7f739066e73c-ovndb-tls-certs\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.150336 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46rkh\" (UniqueName: \"kubernetes.io/projected/9d505505-13f5-4899-b1a5-7f739066e73c-kube-api-access-46rkh\") pod \"neutron-67df497849-l9zzv\" (UID: \"9d505505-13f5-4899-b1a5-7f739066e73c\") " pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.164771 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7989644c86-scmh4"] Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.180229 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66bdd9d8cd-mgd96"] Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.239882 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.376127 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-xgptg"] Dec 11 05:32:25 crc kubenswrapper[4628]: W1211 05:32:25.397496 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00ce84fb_3b4a_4320_bb3a_5c66ce7eb43a.slice/crio-2b604221cfe7017d4f21c8aa6a37c144accd442b46824b085c2046c8fcba12be WatchSource:0}: Error finding container 2b604221cfe7017d4f21c8aa6a37c144accd442b46824b085c2046c8fcba12be: Status 404 returned error can't find the container with id 2b604221cfe7017d4f21c8aa6a37c144accd442b46824b085c2046c8fcba12be Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.580105 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-lh2sk"] Dec 11 05:32:25 crc kubenswrapper[4628]: W1211 05:32:25.613874 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90cbd5ab_58ac_49cf_ba49_e8bf84ec425d.slice/crio-971e59600f1d10db94f5b53d2b32a37607ccc3e4e4b24ca3f402107cb66f5526 WatchSource:0}: Error finding container 971e59600f1d10db94f5b53d2b32a37607ccc3e4e4b24ca3f402107cb66f5526: Status 404 returned error can't find the container with id 971e59600f1d10db94f5b53d2b32a37607ccc3e4e4b24ca3f402107cb66f5526 Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.620169 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.649995 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-79fmw"] Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.727102 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f4cbc8496-gqwsj"] Dec 11 05:32:25 crc kubenswrapper[4628]: I1211 05:32:25.922102 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67df497849-l9zzv"] Dec 11 05:32:26 crc kubenswrapper[4628]: I1211 05:32:26.105498 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d","Type":"ContainerStarted","Data":"971e59600f1d10db94f5b53d2b32a37607ccc3e4e4b24ca3f402107cb66f5526"} Dec 11 05:32:26 crc kubenswrapper[4628]: I1211 05:32:26.109319 4628 generic.go:334] "Generic (PLEG): container finished" podID="00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a" containerID="04b5b557030fd060e472a34748b1f6f5605efa0a0c8bcaf63a0c1de018282dd7" exitCode=0 Dec 11 05:32:26 crc kubenswrapper[4628]: I1211 05:32:26.109495 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" event={"ID":"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a","Type":"ContainerDied","Data":"04b5b557030fd060e472a34748b1f6f5605efa0a0c8bcaf63a0c1de018282dd7"} Dec 11 05:32:26 crc kubenswrapper[4628]: I1211 05:32:26.109512 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" event={"ID":"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a","Type":"ContainerStarted","Data":"2b604221cfe7017d4f21c8aa6a37c144accd442b46824b085c2046c8fcba12be"} Dec 11 05:32:26 crc kubenswrapper[4628]: I1211 05:32:26.113552 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66bdd9d8cd-mgd96" 
event={"ID":"8a3522a5-42e8-46ba-b794-d23582baa2a4","Type":"ContainerStarted","Data":"3e1b4981b4e3cf13b4acdc8cdbcd9a33f424e83ef5decc99c1a62f09e96cd81d"} Dec 11 05:32:26 crc kubenswrapper[4628]: I1211 05:32:26.118981 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67df497849-l9zzv" event={"ID":"9d505505-13f5-4899-b1a5-7f739066e73c","Type":"ContainerStarted","Data":"833e8373e468557bd5669cc8bd5a18a54f8560560abdd71a8fb6da3e28b6c5f6"} Dec 11 05:32:26 crc kubenswrapper[4628]: I1211 05:32:26.123984 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f4cbc8496-gqwsj" event={"ID":"32a1d94f-dcff-4648-8b3e-0b54cf493211","Type":"ContainerStarted","Data":"e9326b9c6124a11357a334c5cc7389070bd8e358da272bffbce67fa77ca421c9"} Dec 11 05:32:26 crc kubenswrapper[4628]: I1211 05:32:26.153012 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" event={"ID":"36ef8b87-70d5-4803-a864-19cde1a04b87","Type":"ContainerStarted","Data":"35b2805a276feea8e1206590cd897e21d12830ac9f16d8daa13112296f352af8"} Dec 11 05:32:26 crc kubenswrapper[4628]: I1211 05:32:26.170602 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8b969" event={"ID":"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c","Type":"ContainerStarted","Data":"4d9ea5e68dc77d1e29479862cc468b74fd29f30f0baa8471daf7986a7f91870f"} Dec 11 05:32:26 crc kubenswrapper[4628]: I1211 05:32:26.182240 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-79fmw" event={"ID":"90c5df18-e257-4561-8148-8cebd4644e40","Type":"ContainerStarted","Data":"76dbd1452f45a8acdca8a7fe11bd6a62b923ed6fa51e9b2c247c72f1db2d91fb"} Dec 11 05:32:26 crc kubenswrapper[4628]: I1211 05:32:26.193981 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nd79k" event={"ID":"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5","Type":"ContainerStarted","Data":"f36432b45933f8a8ac33dbaa563e189156370f2eaac211d62b706a9225ef3346"} Dec 11 05:32:26 crc kubenswrapper[4628]: I1211 05:32:26.210283 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7989644c86-scmh4" event={"ID":"51e02694-e634-4a3b-8406-3b3b72007c2b","Type":"ContainerStarted","Data":"c8fe938758eb8ac553d217f6bcc8f1434572733db8655f03dfc33787acd92120"} Dec 11 05:32:26 crc kubenswrapper[4628]: I1211 05:32:26.225633 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-8b969" podStartSLOduration=7.629258851 podStartE2EDuration="41.225617805s" podCreationTimestamp="2025-12-11 05:31:45 +0000 UTC" firstStartedPulling="2025-12-11 05:31:47.160054535 +0000 UTC m=+1009.577401223" lastFinishedPulling="2025-12-11 05:32:20.756413479 +0000 UTC m=+1043.173760177" observedRunningTime="2025-12-11 05:32:26.205436607 +0000 UTC m=+1048.622783305" watchObservedRunningTime="2025-12-11 05:32:26.225617805 +0000 UTC m=+1048.642964513" Dec 11 05:32:26 crc kubenswrapper[4628]: I1211 05:32:26.254695 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-79fmw" podStartSLOduration=14.254674425 podStartE2EDuration="14.254674425s" podCreationTimestamp="2025-12-11 05:32:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:32:26.237401756 +0000 UTC m=+1048.654748454" watchObservedRunningTime="2025-12-11 05:32:26.254674425 +0000 UTC m=+1048.672021123" Dec 11 05:32:26 crc 
kubenswrapper[4628]: I1211 05:32:26.279285 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-nd79k" podStartSLOduration=7.376821344 podStartE2EDuration="41.279269623s" podCreationTimestamp="2025-12-11 05:31:45 +0000 UTC" firstStartedPulling="2025-12-11 05:31:46.869996576 +0000 UTC m=+1009.287343264" lastFinishedPulling="2025-12-11 05:32:20.772444845 +0000 UTC m=+1043.189791543" observedRunningTime="2025-12-11 05:32:26.259743323 +0000 UTC m=+1048.677090021" watchObservedRunningTime="2025-12-11 05:32:26.279269623 +0000 UTC m=+1048.696616321" Dec 11 05:32:26 crc kubenswrapper[4628]: I1211 05:32:26.754299 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.219887 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7989644c86-scmh4" event={"ID":"51e02694-e634-4a3b-8406-3b3b72007c2b","Type":"ContainerStarted","Data":"3ad6e654edc9b249b80fe88b77387b31f9f3a747e9bd9680476ce34b21b85d8d"} Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.222484 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f4cbc8496-gqwsj" event={"ID":"32a1d94f-dcff-4648-8b3e-0b54cf493211","Type":"ContainerStarted","Data":"ce845a4eb1c022bffc53c8bb0332d1e2eeb8cd5361b82915aba90b6b40ff16f9"} Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.222528 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f4cbc8496-gqwsj" event={"ID":"32a1d94f-dcff-4648-8b3e-0b54cf493211","Type":"ContainerStarted","Data":"397390f28ff14aee582539906e70c1bfcf7d07ac5cf0aa724a9d0e6e5033daab"} Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.223645 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.225543 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66bdd9d8cd-mgd96" event={"ID":"8a3522a5-42e8-46ba-b794-d23582baa2a4","Type":"ContainerStarted","Data":"b60d821722cdfc3fd82f6785ddf1b0a7349d9bd58013594052c7fa0d037fb3be"} Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.231210 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67df497849-l9zzv" event={"ID":"9d505505-13f5-4899-b1a5-7f739066e73c","Type":"ContainerStarted","Data":"c9d4b30e977d2e8dd74eaf817ea807d8ae127c99233de910aa92ed6c8347bc4b"} Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.233002 4628 generic.go:334] "Generic (PLEG): container finished" podID="36ef8b87-70d5-4803-a864-19cde1a04b87" containerID="81d6e31888c76b189e3f1400f54dcbe72b649997247dd4879cc00d1836000919" exitCode=0 Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.233067 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" event={"ID":"36ef8b87-70d5-4803-a864-19cde1a04b87","Type":"ContainerDied","Data":"81d6e31888c76b189e3f1400f54dcbe72b649997247dd4879cc00d1836000919"} Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.237261 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-79fmw" event={"ID":"90c5df18-e257-4561-8148-8cebd4644e40","Type":"ContainerStarted","Data":"2f25c4c977c6a6e912a4fa4df7d2d12e1e16464d0b916865215eb5faffa036fe"} Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.239579 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d","Type":"ContainerStarted","Data":"3b0637cc93fed905c277c7d1aeaa6f5eb797cd36c4c7ecd036dda07f81d0d808"} Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.250536 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7f4cbc8496-gqwsj" podStartSLOduration=5.25052176 podStartE2EDuration="5.25052176s" podCreationTimestamp="2025-12-11 05:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:32:27.246252704 +0000 UTC m=+1049.663599402" watchObservedRunningTime="2025-12-11 05:32:27.25052176 +0000 UTC m=+1049.667868458" Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.626484 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.730936 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-dns-svc\") pod \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.731073 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-ovsdbserver-sb\") pod \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.731106 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lwxf\" (UniqueName: \"kubernetes.io/projected/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-kube-api-access-8lwxf\") pod \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.731243 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-config\") pod \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.731274 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-ovsdbserver-nb\") pod \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.731328 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-dns-swift-storage-0\") pod \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\" (UID: \"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a\") " Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.743579 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-kube-api-access-8lwxf" (OuterVolumeSpecName: "kube-api-access-8lwxf") pod "00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a" (UID: "00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a"). InnerVolumeSpecName "kube-api-access-8lwxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.796104 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a" (UID: "00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.815049 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a" (UID: "00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.825940 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-config" (OuterVolumeSpecName: "config") pod "00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a" (UID: "00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.826581 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a" (UID: "00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.833349 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.833378 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.833389 4628 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.833397 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.833420 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lwxf\" (UniqueName: \"kubernetes.io/projected/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-kube-api-access-8lwxf\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.836755 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a" (UID: "00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:27 crc kubenswrapper[4628]: I1211 05:32:27.935485 4628 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:28 crc kubenswrapper[4628]: I1211 05:32:28.260714 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" event={"ID":"00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a","Type":"ContainerDied","Data":"2b604221cfe7017d4f21c8aa6a37c144accd442b46824b085c2046c8fcba12be"} Dec 11 05:32:28 crc kubenswrapper[4628]: I1211 05:32:28.261006 4628 scope.go:117] "RemoveContainer" containerID="04b5b557030fd060e472a34748b1f6f5605efa0a0c8bcaf63a0c1de018282dd7" Dec 11 05:32:28 crc kubenswrapper[4628]: I1211 05:32:28.260766 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-xgptg" Dec 11 05:32:28 crc kubenswrapper[4628]: I1211 05:32:28.267801 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6bb5497e-604f-4176-b214-0343c94e89c2","Type":"ContainerStarted","Data":"7bf592c9ccc209f257b6f3d72ad73c3c4657708898f06ea549c9f3c4b7193918"} Dec 11 05:32:28 crc kubenswrapper[4628]: I1211 05:32:28.368274 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-xgptg"] Dec 11 05:32:28 crc kubenswrapper[4628]: I1211 05:32:28.376830 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-xgptg"] Dec 11 05:32:29 crc kubenswrapper[4628]: I1211 05:32:29.902645 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a" path="/var/lib/kubelet/pods/00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a/volumes" Dec 11 05:32:30 crc kubenswrapper[4628]: I1211 05:32:30.795355 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e4498a18-7449-45b3-9061-d3ffbfa4be5b" containerName="galera" probeResult="failure" output="command timed out" Dec 11 05:32:30 crc kubenswrapper[4628]: I1211 05:32:30.795419 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="e4498a18-7449-45b3-9061-d3ffbfa4be5b" containerName="galera" probeResult="failure" output="command timed out" Dec 11 05:32:31 crc kubenswrapper[4628]: I1211 05:32:31.306542 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6bb5497e-604f-4176-b214-0343c94e89c2","Type":"ContainerStarted","Data":"f365f5fd0a95c264ec80dd6b9c2154d47fabb37948e3ee5415c7f51abf244413"} Dec 11 05:32:31 crc kubenswrapper[4628]: I1211 05:32:31.312958 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66bdd9d8cd-mgd96" event={"ID":"8a3522a5-42e8-46ba-b794-d23582baa2a4","Type":"ContainerStarted","Data":"a746c66aade8058642983deeedde27bfe41ebbcf4cc43d9cec8d1a2cd699c9e3"} Dec 11 05:32:31 crc kubenswrapper[4628]: I1211 05:32:31.336555 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" event={"ID":"36ef8b87-70d5-4803-a864-19cde1a04b87","Type":"ContainerStarted","Data":"c844db229559a48791024563eb13063999c90e5c8aff7101cb52ccbaf156c0d6"} Dec 11 05:32:31 crc kubenswrapper[4628]: I1211 05:32:31.337687 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:31 crc kubenswrapper[4628]: I1211 05:32:31.382514 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-66bdd9d8cd-mgd96" podStartSLOduration=36.726487263 podStartE2EDuration="37.382497299s" podCreationTimestamp="2025-12-11 05:31:54 +0000 UTC" firstStartedPulling="2025-12-11 05:32:25.19588548 +0000 UTC m=+1047.613232178" lastFinishedPulling="2025-12-11 05:32:25.851895526 +0000 UTC m=+1048.269242214" observedRunningTime="2025-12-11 05:32:31.355902877 +0000 UTC m=+1053.773249585" watchObservedRunningTime="2025-12-11 05:32:31.382497299 +0000 UTC m=+1053.799843987" Dec 11 05:32:31 crc kubenswrapper[4628]: I1211 05:32:31.387231 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" podStartSLOduration=9.387218787 podStartE2EDuration="9.387218787s" podCreationTimestamp="2025-12-11 05:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:32:31.371384527 +0000 UTC m=+1053.788731245" watchObservedRunningTime="2025-12-11 05:32:31.387218787 +0000 UTC m=+1053.804565485" Dec 11 05:32:32 crc kubenswrapper[4628]: I1211 05:32:32.349349 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67df497849-l9zzv" event={"ID":"9d505505-13f5-4899-b1a5-7f739066e73c","Type":"ContainerStarted","Data":"056021ab8edda3d821e2ece209c3e1c374bfeede1ae2be2f123dfc0f5121801f"} Dec 11 05:32:32 crc kubenswrapper[4628]: I1211 05:32:32.350132 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:32 crc kubenswrapper[4628]: I1211 05:32:32.353352 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6bb5497e-604f-4176-b214-0343c94e89c2","Type":"ContainerStarted","Data":"f4f82613c22c4813a3280e38f2d94a97d5de250af67a761d4df1f379436e8eae"} Dec 11 05:32:32 crc kubenswrapper[4628]: I1211 05:32:32.353463 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6bb5497e-604f-4176-b214-0343c94e89c2" containerName="glance-log" containerID="cri-o://f365f5fd0a95c264ec80dd6b9c2154d47fabb37948e3ee5415c7f51abf244413" gracePeriod=30 Dec 11 05:32:32 crc kubenswrapper[4628]: I1211 05:32:32.353682 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6bb5497e-604f-4176-b214-0343c94e89c2" containerName="glance-httpd" containerID="cri-o://f4f82613c22c4813a3280e38f2d94a97d5de250af67a761d4df1f379436e8eae" gracePeriod=30 Dec 11 05:32:32 crc kubenswrapper[4628]: I1211 05:32:32.356810 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d","Type":"ContainerStarted","Data":"0d855a506749b754536bddbfc0567614e4acd53f12def2d8a4920b4050dde299"} Dec 11 05:32:32 crc kubenswrapper[4628]: I1211 05:32:32.356873 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="90cbd5ab-58ac-49cf-ba49-e8bf84ec425d" containerName="glance-log" containerID="cri-o://3b0637cc93fed905c277c7d1aeaa6f5eb797cd36c4c7ecd036dda07f81d0d808" gracePeriod=30 Dec 11 05:32:32 crc kubenswrapper[4628]: I1211 05:32:32.356894 4628 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack/glance-default-internal-api-0" podUID="90cbd5ab-58ac-49cf-ba49-e8bf84ec425d" containerName="glance-httpd" containerID="cri-o://0d855a506749b754536bddbfc0567614e4acd53f12def2d8a4920b4050dde299" gracePeriod=30 Dec 11 05:32:32 crc kubenswrapper[4628]: I1211 05:32:32.359226 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48b1c132-b854-4494-9e51-d934e9946366","Type":"ContainerStarted","Data":"d97914ed64d5ccdeb58d74607b9c2b8cd80b527c374db55d801870f988ed65c2"} Dec 11 05:32:32 crc kubenswrapper[4628]: I1211 05:32:32.363878 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7989644c86-scmh4" event={"ID":"51e02694-e634-4a3b-8406-3b3b72007c2b","Type":"ContainerStarted","Data":"a46dfba4c1bfdcaa84310b4dfca31570004689699442a739dc16eb0acc8bbf84"} Dec 11 05:32:32 crc kubenswrapper[4628]: I1211 05:32:32.375623 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67df497849-l9zzv" podStartSLOduration=8.375608501 podStartE2EDuration="8.375608501s" podCreationTimestamp="2025-12-11 05:32:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:32:32.37191405 +0000 UTC m=+1054.789260758" watchObservedRunningTime="2025-12-11 05:32:32.375608501 +0000 UTC m=+1054.792955199" Dec 11 05:32:32 crc kubenswrapper[4628]: I1211 05:32:32.399698 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7989644c86-scmh4" podStartSLOduration=37.779003068 podStartE2EDuration="38.399682014s" podCreationTimestamp="2025-12-11 05:31:54 +0000 UTC" firstStartedPulling="2025-12-11 05:32:25.170768888 +0000 UTC m=+1047.588115586" lastFinishedPulling="2025-12-11 05:32:25.791447834 +0000 UTC m=+1048.208794532" observedRunningTime="2025-12-11 05:32:32.39325465 +0000 UTC m=+1054.810601348" watchObservedRunningTime="2025-12-11 05:32:32.399682014 +0000 UTC m=+1054.817028712" Dec 11 05:32:32 crc kubenswrapper[4628]: I1211 05:32:32.447132 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=30.447111682 podStartE2EDuration="30.447111682s" podCreationTimestamp="2025-12-11 05:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:32:32.43781245 +0000 UTC m=+1054.855159148" watchObservedRunningTime="2025-12-11 05:32:32.447111682 +0000 UTC m=+1054.864458380" Dec 11 05:32:32 crc kubenswrapper[4628]: I1211 05:32:32.504650 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=30.504633935 podStartE2EDuration="30.504633935s" podCreationTimestamp="2025-12-11 05:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:32:32.467697182 +0000 UTC m=+1054.885043890" watchObservedRunningTime="2025-12-11 05:32:32.504633935 +0000 UTC m=+1054.921980633" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.095157 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.170990 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.265009 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-logs\") pod \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.265057 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bb5497e-604f-4176-b214-0343c94e89c2-logs\") pod \"6bb5497e-604f-4176-b214-0343c94e89c2\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.265082 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbbp9\" (UniqueName: \"kubernetes.io/projected/6bb5497e-604f-4176-b214-0343c94e89c2-kube-api-access-pbbp9\") pod \"6bb5497e-604f-4176-b214-0343c94e89c2\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.265117 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-combined-ca-bundle\") pod \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.265148 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6bb5497e-604f-4176-b214-0343c94e89c2-httpd-run\") pod \"6bb5497e-604f-4176-b214-0343c94e89c2\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.265164 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-config-data\") pod \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.265212 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jfht\" (UniqueName: \"kubernetes.io/projected/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-kube-api-access-6jfht\") pod \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.265283 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-scripts\") pod \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.265302 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"6bb5497e-604f-4176-b214-0343c94e89c2\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.265331 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb5497e-604f-4176-b214-0343c94e89c2-combined-ca-bundle\") pod \"6bb5497e-604f-4176-b214-0343c94e89c2\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " Dec 11 05:32:33 
crc kubenswrapper[4628]: I1211 05:32:33.265369 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bb5497e-604f-4176-b214-0343c94e89c2-config-data\") pod \"6bb5497e-604f-4176-b214-0343c94e89c2\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.265425 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.265452 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-httpd-run\") pod \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\" (UID: \"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d\") " Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.265469 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bb5497e-604f-4176-b214-0343c94e89c2-scripts\") pod \"6bb5497e-604f-4176-b214-0343c94e89c2\" (UID: \"6bb5497e-604f-4176-b214-0343c94e89c2\") " Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.266609 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bb5497e-604f-4176-b214-0343c94e89c2-logs" (OuterVolumeSpecName: "logs") pod "6bb5497e-604f-4176-b214-0343c94e89c2" (UID: "6bb5497e-604f-4176-b214-0343c94e89c2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.266707 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bb5497e-604f-4176-b214-0343c94e89c2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6bb5497e-604f-4176-b214-0343c94e89c2" (UID: "6bb5497e-604f-4176-b214-0343c94e89c2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.266839 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-logs" (OuterVolumeSpecName: "logs") pod "90cbd5ab-58ac-49cf-ba49-e8bf84ec425d" (UID: "90cbd5ab-58ac-49cf-ba49-e8bf84ec425d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.267053 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "90cbd5ab-58ac-49cf-ba49-e8bf84ec425d" (UID: "90cbd5ab-58ac-49cf-ba49-e8bf84ec425d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.278621 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bb5497e-604f-4176-b214-0343c94e89c2-kube-api-access-pbbp9" (OuterVolumeSpecName: "kube-api-access-pbbp9") pod "6bb5497e-604f-4176-b214-0343c94e89c2" (UID: "6bb5497e-604f-4176-b214-0343c94e89c2"). InnerVolumeSpecName "kube-api-access-pbbp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.280136 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bb5497e-604f-4176-b214-0343c94e89c2-scripts" (OuterVolumeSpecName: "scripts") pod "6bb5497e-604f-4176-b214-0343c94e89c2" (UID: "6bb5497e-604f-4176-b214-0343c94e89c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.284130 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "6bb5497e-604f-4176-b214-0343c94e89c2" (UID: "6bb5497e-604f-4176-b214-0343c94e89c2"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.285964 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-scripts" (OuterVolumeSpecName: "scripts") pod "90cbd5ab-58ac-49cf-ba49-e8bf84ec425d" (UID: "90cbd5ab-58ac-49cf-ba49-e8bf84ec425d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.286083 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-kube-api-access-6jfht" (OuterVolumeSpecName: "kube-api-access-6jfht") pod "90cbd5ab-58ac-49cf-ba49-e8bf84ec425d" (UID: "90cbd5ab-58ac-49cf-ba49-e8bf84ec425d"). InnerVolumeSpecName "kube-api-access-6jfht". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.303069 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "90cbd5ab-58ac-49cf-ba49-e8bf84ec425d" (UID: "90cbd5ab-58ac-49cf-ba49-e8bf84ec425d"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.306290 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90cbd5ab-58ac-49cf-ba49-e8bf84ec425d" (UID: "90cbd5ab-58ac-49cf-ba49-e8bf84ec425d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.321268 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bb5497e-604f-4176-b214-0343c94e89c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6bb5497e-604f-4176-b214-0343c94e89c2" (UID: "6bb5497e-604f-4176-b214-0343c94e89c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.332474 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-config-data" (OuterVolumeSpecName: "config-data") pod "90cbd5ab-58ac-49cf-ba49-e8bf84ec425d" (UID: "90cbd5ab-58ac-49cf-ba49-e8bf84ec425d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.353490 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bb5497e-604f-4176-b214-0343c94e89c2-config-data" (OuterVolumeSpecName: "config-data") pod "6bb5497e-604f-4176-b214-0343c94e89c2" (UID: "6bb5497e-604f-4176-b214-0343c94e89c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.366862 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bb5497e-604f-4176-b214-0343c94e89c2-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.366900 4628 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.366914 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bb5497e-604f-4176-b214-0343c94e89c2-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.366924 4628 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.366933 4628 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-logs\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.366941 4628 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bb5497e-604f-4176-b214-0343c94e89c2-logs\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.366949 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbbp9\" (UniqueName: \"kubernetes.io/projected/6bb5497e-604f-4176-b214-0343c94e89c2-kube-api-access-pbbp9\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.366959 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.366967 4628 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6bb5497e-604f-4176-b214-0343c94e89c2-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.366974 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.366982 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jfht\" (UniqueName: \"kubernetes.io/projected/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-kube-api-access-6jfht\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.366990 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.367005 4628 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.367027 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb5497e-604f-4176-b214-0343c94e89c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.380332 4628 generic.go:334] "Generic (PLEG): container finished" podID="6bb5497e-604f-4176-b214-0343c94e89c2" containerID="f4f82613c22c4813a3280e38f2d94a97d5de250af67a761d4df1f379436e8eae" exitCode=143 Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.380363 4628 generic.go:334] "Generic (PLEG): container finished" podID="6bb5497e-604f-4176-b214-0343c94e89c2" containerID="f365f5fd0a95c264ec80dd6b9c2154d47fabb37948e3ee5415c7f51abf244413" exitCode=143 Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.380393 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6bb5497e-604f-4176-b214-0343c94e89c2","Type":"ContainerDied","Data":"f4f82613c22c4813a3280e38f2d94a97d5de250af67a761d4df1f379436e8eae"} Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.380417 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6bb5497e-604f-4176-b214-0343c94e89c2","Type":"ContainerDied","Data":"f365f5fd0a95c264ec80dd6b9c2154d47fabb37948e3ee5415c7f51abf244413"} Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.380427 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6bb5497e-604f-4176-b214-0343c94e89c2","Type":"ContainerDied","Data":"7bf592c9ccc209f257b6f3d72ad73c3c4657708898f06ea549c9f3c4b7193918"} Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.380442 4628 scope.go:117] "RemoveContainer" containerID="f4f82613c22c4813a3280e38f2d94a97d5de250af67a761d4df1f379436e8eae" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.380551 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.386083 4628 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.389304 4628 generic.go:334] "Generic (PLEG): container finished" podID="90cbd5ab-58ac-49cf-ba49-e8bf84ec425d" containerID="0d855a506749b754536bddbfc0567614e4acd53f12def2d8a4920b4050dde299" exitCode=0 Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.389331 4628 generic.go:334] "Generic (PLEG): container finished" podID="90cbd5ab-58ac-49cf-ba49-e8bf84ec425d" containerID="3b0637cc93fed905c277c7d1aeaa6f5eb797cd36c4c7ecd036dda07f81d0d808" exitCode=143 Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.389969 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.390107 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d","Type":"ContainerDied","Data":"0d855a506749b754536bddbfc0567614e4acd53f12def2d8a4920b4050dde299"} Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.390132 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d","Type":"ContainerDied","Data":"3b0637cc93fed905c277c7d1aeaa6f5eb797cd36c4c7ecd036dda07f81d0d808"} Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.390142 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"90cbd5ab-58ac-49cf-ba49-e8bf84ec425d","Type":"ContainerDied","Data":"971e59600f1d10db94f5b53d2b32a37607ccc3e4e4b24ca3f402107cb66f5526"} Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.412283 4628 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.426180 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.433346 4628 scope.go:117] "RemoveContainer" containerID="f365f5fd0a95c264ec80dd6b9c2154d47fabb37948e3ee5415c7f51abf244413" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.446352 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.459304 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.465537 4628 scope.go:117] "RemoveContainer" containerID="f4f82613c22c4813a3280e38f2d94a97d5de250af67a761d4df1f379436e8eae" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.469008 4628 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.469034 4628 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:33 crc kubenswrapper[4628]: E1211 05:32:33.469073 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4f82613c22c4813a3280e38f2d94a97d5de250af67a761d4df1f379436e8eae\": container with ID starting with f4f82613c22c4813a3280e38f2d94a97d5de250af67a761d4df1f379436e8eae not found: ID does not exist" containerID="f4f82613c22c4813a3280e38f2d94a97d5de250af67a761d4df1f379436e8eae" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.469097 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f82613c22c4813a3280e38f2d94a97d5de250af67a761d4df1f379436e8eae"} err="failed to get container status \"f4f82613c22c4813a3280e38f2d94a97d5de250af67a761d4df1f379436e8eae\": rpc error: code = NotFound desc = could not find container \"f4f82613c22c4813a3280e38f2d94a97d5de250af67a761d4df1f379436e8eae\": container with ID starting with 
f4f82613c22c4813a3280e38f2d94a97d5de250af67a761d4df1f379436e8eae not found: ID does not exist" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.469120 4628 scope.go:117] "RemoveContainer" containerID="f365f5fd0a95c264ec80dd6b9c2154d47fabb37948e3ee5415c7f51abf244413" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.469205 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.474940 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 05:32:33 crc kubenswrapper[4628]: E1211 05:32:33.475309 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90cbd5ab-58ac-49cf-ba49-e8bf84ec425d" containerName="glance-httpd" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.475320 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="90cbd5ab-58ac-49cf-ba49-e8bf84ec425d" containerName="glance-httpd" Dec 11 05:32:33 crc kubenswrapper[4628]: E1211 05:32:33.475338 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb5497e-604f-4176-b214-0343c94e89c2" containerName="glance-httpd" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.475344 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb5497e-604f-4176-b214-0343c94e89c2" containerName="glance-httpd" Dec 11 05:32:33 crc kubenswrapper[4628]: E1211 05:32:33.475363 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90cbd5ab-58ac-49cf-ba49-e8bf84ec425d" containerName="glance-log" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.475370 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="90cbd5ab-58ac-49cf-ba49-e8bf84ec425d" containerName="glance-log" Dec 11 05:32:33 crc kubenswrapper[4628]: E1211 05:32:33.475379 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a" containerName="init" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.475386 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a" containerName="init" Dec 11 05:32:33 crc kubenswrapper[4628]: E1211 05:32:33.475406 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb5497e-604f-4176-b214-0343c94e89c2" containerName="glance-log" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.475412 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb5497e-604f-4176-b214-0343c94e89c2" containerName="glance-log" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.475565 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="90cbd5ab-58ac-49cf-ba49-e8bf84ec425d" containerName="glance-httpd" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.475578 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="90cbd5ab-58ac-49cf-ba49-e8bf84ec425d" containerName="glance-log" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.475588 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bb5497e-604f-4176-b214-0343c94e89c2" containerName="glance-log" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.475597 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ce84fb-3b4a-4320-bb3a-5c66ce7eb43a" containerName="init" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.475612 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bb5497e-604f-4176-b214-0343c94e89c2" containerName="glance-httpd" Dec 11 05:32:33 crc 
kubenswrapper[4628]: E1211 05:32:33.476095 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f365f5fd0a95c264ec80dd6b9c2154d47fabb37948e3ee5415c7f51abf244413\": container with ID starting with f365f5fd0a95c264ec80dd6b9c2154d47fabb37948e3ee5415c7f51abf244413 not found: ID does not exist" containerID="f365f5fd0a95c264ec80dd6b9c2154d47fabb37948e3ee5415c7f51abf244413" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.476135 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f365f5fd0a95c264ec80dd6b9c2154d47fabb37948e3ee5415c7f51abf244413"} err="failed to get container status \"f365f5fd0a95c264ec80dd6b9c2154d47fabb37948e3ee5415c7f51abf244413\": rpc error: code = NotFound desc = could not find container \"f365f5fd0a95c264ec80dd6b9c2154d47fabb37948e3ee5415c7f51abf244413\": container with ID starting with f365f5fd0a95c264ec80dd6b9c2154d47fabb37948e3ee5415c7f51abf244413 not found: ID does not exist" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.476164 4628 scope.go:117] "RemoveContainer" containerID="f4f82613c22c4813a3280e38f2d94a97d5de250af67a761d4df1f379436e8eae" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.476477 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.478046 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f82613c22c4813a3280e38f2d94a97d5de250af67a761d4df1f379436e8eae"} err="failed to get container status \"f4f82613c22c4813a3280e38f2d94a97d5de250af67a761d4df1f379436e8eae\": rpc error: code = NotFound desc = could not find container \"f4f82613c22c4813a3280e38f2d94a97d5de250af67a761d4df1f379436e8eae\": container with ID starting with f4f82613c22c4813a3280e38f2d94a97d5de250af67a761d4df1f379436e8eae not found: ID does not exist" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.478064 4628 scope.go:117] "RemoveContainer" containerID="f365f5fd0a95c264ec80dd6b9c2154d47fabb37948e3ee5415c7f51abf244413" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.479884 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.480073 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.480966 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f365f5fd0a95c264ec80dd6b9c2154d47fabb37948e3ee5415c7f51abf244413"} err="failed to get container status \"f365f5fd0a95c264ec80dd6b9c2154d47fabb37948e3ee5415c7f51abf244413\": rpc error: code = NotFound desc = could not find container \"f365f5fd0a95c264ec80dd6b9c2154d47fabb37948e3ee5415c7f51abf244413\": container with ID starting with f365f5fd0a95c264ec80dd6b9c2154d47fabb37948e3ee5415c7f51abf244413 not found: ID does not exist" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.481003 4628 scope.go:117] "RemoveContainer" containerID="0d855a506749b754536bddbfc0567614e4acd53f12def2d8a4920b4050dde299" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.481019 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.481099 4628 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.480194 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-92zmr" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.487284 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.488580 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.502226 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.503278 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.536433 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.579763 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.579818 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-scripts\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.579901 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td4nq\" (UniqueName: \"kubernetes.io/projected/2696c26e-6fad-43c9-975f-f73149e0466d-kube-api-access-td4nq\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.579921 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2696c26e-6fad-43c9-975f-f73149e0466d-logs\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.579941 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.579976 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb26f327-99d0-4eb1-8c92-d36b17068b04-logs\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc 
kubenswrapper[4628]: I1211 05:32:33.579994 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vhrr\" (UniqueName: \"kubernetes.io/projected/eb26f327-99d0-4eb1-8c92-d36b17068b04-kube-api-access-5vhrr\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.580021 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.580053 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.580211 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2696c26e-6fad-43c9-975f-f73149e0466d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.580284 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-config-data\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.580308 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb26f327-99d0-4eb1-8c92-d36b17068b04-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.580348 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.580386 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.580414 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " 
pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.581258 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.609382 4628 scope.go:117] "RemoveContainer" containerID="3b0637cc93fed905c277c7d1aeaa6f5eb797cd36c4c7ecd036dda07f81d0d808" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.657684 4628 scope.go:117] "RemoveContainer" containerID="0d855a506749b754536bddbfc0567614e4acd53f12def2d8a4920b4050dde299" Dec 11 05:32:33 crc kubenswrapper[4628]: E1211 05:32:33.658245 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d855a506749b754536bddbfc0567614e4acd53f12def2d8a4920b4050dde299\": container with ID starting with 0d855a506749b754536bddbfc0567614e4acd53f12def2d8a4920b4050dde299 not found: ID does not exist" containerID="0d855a506749b754536bddbfc0567614e4acd53f12def2d8a4920b4050dde299" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.658276 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d855a506749b754536bddbfc0567614e4acd53f12def2d8a4920b4050dde299"} err="failed to get container status \"0d855a506749b754536bddbfc0567614e4acd53f12def2d8a4920b4050dde299\": rpc error: code = NotFound desc = could not find container \"0d855a506749b754536bddbfc0567614e4acd53f12def2d8a4920b4050dde299\": container with ID starting with 0d855a506749b754536bddbfc0567614e4acd53f12def2d8a4920b4050dde299 not found: ID does not exist" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.658297 4628 scope.go:117] "RemoveContainer" containerID="3b0637cc93fed905c277c7d1aeaa6f5eb797cd36c4c7ecd036dda07f81d0d808" Dec 11 05:32:33 crc kubenswrapper[4628]: E1211 05:32:33.658769 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b0637cc93fed905c277c7d1aeaa6f5eb797cd36c4c7ecd036dda07f81d0d808\": container with ID starting with 3b0637cc93fed905c277c7d1aeaa6f5eb797cd36c4c7ecd036dda07f81d0d808 not found: ID does not exist" containerID="3b0637cc93fed905c277c7d1aeaa6f5eb797cd36c4c7ecd036dda07f81d0d808" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.658790 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b0637cc93fed905c277c7d1aeaa6f5eb797cd36c4c7ecd036dda07f81d0d808"} err="failed to get container status \"3b0637cc93fed905c277c7d1aeaa6f5eb797cd36c4c7ecd036dda07f81d0d808\": rpc error: code = NotFound desc = could not find container \"3b0637cc93fed905c277c7d1aeaa6f5eb797cd36c4c7ecd036dda07f81d0d808\": container with ID starting with 3b0637cc93fed905c277c7d1aeaa6f5eb797cd36c4c7ecd036dda07f81d0d808 not found: ID does not exist" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.658805 4628 scope.go:117] "RemoveContainer" containerID="0d855a506749b754536bddbfc0567614e4acd53f12def2d8a4920b4050dde299" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.659653 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d855a506749b754536bddbfc0567614e4acd53f12def2d8a4920b4050dde299"} err="failed to get container 
status \"0d855a506749b754536bddbfc0567614e4acd53f12def2d8a4920b4050dde299\": rpc error: code = NotFound desc = could not find container \"0d855a506749b754536bddbfc0567614e4acd53f12def2d8a4920b4050dde299\": container with ID starting with 0d855a506749b754536bddbfc0567614e4acd53f12def2d8a4920b4050dde299 not found: ID does not exist" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.659704 4628 scope.go:117] "RemoveContainer" containerID="3b0637cc93fed905c277c7d1aeaa6f5eb797cd36c4c7ecd036dda07f81d0d808" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.660361 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b0637cc93fed905c277c7d1aeaa6f5eb797cd36c4c7ecd036dda07f81d0d808"} err="failed to get container status \"3b0637cc93fed905c277c7d1aeaa6f5eb797cd36c4c7ecd036dda07f81d0d808\": rpc error: code = NotFound desc = could not find container \"3b0637cc93fed905c277c7d1aeaa6f5eb797cd36c4c7ecd036dda07f81d0d808\": container with ID starting with 3b0637cc93fed905c277c7d1aeaa6f5eb797cd36c4c7ecd036dda07f81d0d808 not found: ID does not exist" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.682437 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.682487 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.682517 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.682534 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.682553 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.682572 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-scripts\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.682601 4628 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-td4nq\" (UniqueName: \"kubernetes.io/projected/2696c26e-6fad-43c9-975f-f73149e0466d-kube-api-access-td4nq\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.682617 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2696c26e-6fad-43c9-975f-f73149e0466d-logs\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.682631 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.682660 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb26f327-99d0-4eb1-8c92-d36b17068b04-logs\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.682680 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vhrr\" (UniqueName: \"kubernetes.io/projected/eb26f327-99d0-4eb1-8c92-d36b17068b04-kube-api-access-5vhrr\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.682697 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.682720 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.682770 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2696c26e-6fad-43c9-975f-f73149e0466d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.682805 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-config-data\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.682860 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/eb26f327-99d0-4eb1-8c92-d36b17068b04-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.683373 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb26f327-99d0-4eb1-8c92-d36b17068b04-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.683711 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2696c26e-6fad-43c9-975f-f73149e0466d-logs\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.683912 4628 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.689485 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.690226 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2696c26e-6fad-43c9-975f-f73149e0466d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.690481 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb26f327-99d0-4eb1-8c92-d36b17068b04-logs\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.690589 4628 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.714768 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.715946 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.719032 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-config-data\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.719106 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-scripts\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.719504 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.719943 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.721826 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.722273 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td4nq\" (UniqueName: \"kubernetes.io/projected/2696c26e-6fad-43c9-975f-f73149e0466d-kube-api-access-td4nq\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.752476 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vhrr\" (UniqueName: \"kubernetes.io/projected/eb26f327-99d0-4eb1-8c92-d36b17068b04-kube-api-access-5vhrr\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.752700 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.763995 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.881355 4628 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.900130 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bb5497e-604f-4176-b214-0343c94e89c2" path="/var/lib/kubelet/pods/6bb5497e-604f-4176-b214-0343c94e89c2/volumes" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.900969 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90cbd5ab-58ac-49cf-ba49-e8bf84ec425d" path="/var/lib/kubelet/pods/90cbd5ab-58ac-49cf-ba49-e8bf84ec425d/volumes" Dec 11 05:32:33 crc kubenswrapper[4628]: I1211 05:32:33.927591 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 11 05:32:34 crc kubenswrapper[4628]: I1211 05:32:34.403036 4628 generic.go:334] "Generic (PLEG): container finished" podID="c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c" containerID="4d9ea5e68dc77d1e29479862cc468b74fd29f30f0baa8471daf7986a7f91870f" exitCode=0 Dec 11 05:32:34 crc kubenswrapper[4628]: I1211 05:32:34.403136 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8b969" event={"ID":"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c","Type":"ContainerDied","Data":"4d9ea5e68dc77d1e29479862cc468b74fd29f30f0baa8471daf7986a7f91870f"} Dec 11 05:32:34 crc kubenswrapper[4628]: I1211 05:32:34.413014 4628 generic.go:334] "Generic (PLEG): container finished" podID="90c5df18-e257-4561-8148-8cebd4644e40" containerID="2f25c4c977c6a6e912a4fa4df7d2d12e1e16464d0b916865215eb5faffa036fe" exitCode=0 Dec 11 05:32:34 crc kubenswrapper[4628]: I1211 05:32:34.413110 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-79fmw" event={"ID":"90c5df18-e257-4561-8148-8cebd4644e40","Type":"ContainerDied","Data":"2f25c4c977c6a6e912a4fa4df7d2d12e1e16464d0b916865215eb5faffa036fe"} Dec 11 05:32:34 crc kubenswrapper[4628]: I1211 05:32:34.581997 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 05:32:34 crc kubenswrapper[4628]: I1211 05:32:34.680164 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 05:32:34 crc kubenswrapper[4628]: W1211 05:32:34.709657 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2696c26e_6fad_43c9_975f_f73149e0466d.slice/crio-dcec039349918899a7f6d8ee781f4d31e990784f3e5c12aab265a55e47971219 WatchSource:0}: Error finding container dcec039349918899a7f6d8ee781f4d31e990784f3e5c12aab265a55e47971219: Status 404 returned error can't find the container with id dcec039349918899a7f6d8ee781f4d31e990784f3e5c12aab265a55e47971219 Dec 11 05:32:35 crc kubenswrapper[4628]: I1211 05:32:35.093923 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:32:35 crc kubenswrapper[4628]: I1211 05:32:35.094008 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:32:35 crc kubenswrapper[4628]: I1211 05:32:35.096603 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:32:35 crc kubenswrapper[4628]: I1211 05:32:35.096652 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:32:35 crc kubenswrapper[4628]: I1211 
05:32:35.447821 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eb26f327-99d0-4eb1-8c92-d36b17068b04","Type":"ContainerStarted","Data":"948d42f7ce8fd2230a104db328e13fee4e18b517395b7097371dd1f6a2e67d19"} Dec 11 05:32:35 crc kubenswrapper[4628]: I1211 05:32:35.453989 4628 generic.go:334] "Generic (PLEG): container finished" podID="0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5" containerID="f36432b45933f8a8ac33dbaa563e189156370f2eaac211d62b706a9225ef3346" exitCode=0 Dec 11 05:32:35 crc kubenswrapper[4628]: I1211 05:32:35.454063 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nd79k" event={"ID":"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5","Type":"ContainerDied","Data":"f36432b45933f8a8ac33dbaa563e189156370f2eaac211d62b706a9225ef3346"} Dec 11 05:32:35 crc kubenswrapper[4628]: I1211 05:32:35.472941 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2696c26e-6fad-43c9-975f-f73149e0466d","Type":"ContainerStarted","Data":"dcec039349918899a7f6d8ee781f4d31e990784f3e5c12aab265a55e47971219"} Dec 11 05:32:36 crc kubenswrapper[4628]: I1211 05:32:36.481733 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2696c26e-6fad-43c9-975f-f73149e0466d","Type":"ContainerStarted","Data":"f84461630f989c2d9c1f50d65cd98133a40c9bb28140f44835fbe2cea1535224"} Dec 11 05:32:36 crc kubenswrapper[4628]: I1211 05:32:36.483436 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eb26f327-99d0-4eb1-8c92-d36b17068b04","Type":"ContainerStarted","Data":"d3ae4e692c01a7524ebae194f4f97f400110bc0aacbf136003e7360ed0404a03"} Dec 11 05:32:37 crc kubenswrapper[4628]: I1211 05:32:37.513099 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:32:37 crc kubenswrapper[4628]: I1211 05:32:37.582011 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-dlqkw"] Dec 11 05:32:37 crc kubenswrapper[4628]: I1211 05:32:37.582242 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" podUID="061f7965-d09e-4f01-9ee8-06638befdf0c" containerName="dnsmasq-dns" containerID="cri-o://f9f5853e3e8d41dccf4b7dd2da5f88496fb808528fa4f3aeac1360f1a200d080" gracePeriod=10 Dec 11 05:32:38 crc kubenswrapper[4628]: I1211 05:32:38.906908 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-nd79k" Dec 11 05:32:39 crc kubenswrapper[4628]: I1211 05:32:39.014316 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5-db-sync-config-data\") pod \"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5\" (UID: \"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5\") " Dec 11 05:32:39 crc kubenswrapper[4628]: I1211 05:32:39.014492 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5-combined-ca-bundle\") pod \"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5\" (UID: \"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5\") " Dec 11 05:32:39 crc kubenswrapper[4628]: I1211 05:32:39.014625 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2wpr\" (UniqueName: \"kubernetes.io/projected/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5-kube-api-access-p2wpr\") pod \"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5\" (UID: \"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5\") " Dec 11 05:32:39 crc kubenswrapper[4628]: I1211 05:32:39.020294 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5" (UID: "0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:39 crc kubenswrapper[4628]: I1211 05:32:39.032528 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5-kube-api-access-p2wpr" (OuterVolumeSpecName: "kube-api-access-p2wpr") pod "0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5" (UID: "0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5"). InnerVolumeSpecName "kube-api-access-p2wpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:32:39 crc kubenswrapper[4628]: I1211 05:32:39.077953 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5" (UID: "0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:39 crc kubenswrapper[4628]: I1211 05:32:39.117073 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2wpr\" (UniqueName: \"kubernetes.io/projected/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5-kube-api-access-p2wpr\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:39 crc kubenswrapper[4628]: I1211 05:32:39.117107 4628 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:39 crc kubenswrapper[4628]: I1211 05:32:39.117116 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:39 crc kubenswrapper[4628]: I1211 05:32:39.516696 4628 generic.go:334] "Generic (PLEG): container finished" podID="061f7965-d09e-4f01-9ee8-06638befdf0c" containerID="f9f5853e3e8d41dccf4b7dd2da5f88496fb808528fa4f3aeac1360f1a200d080" exitCode=0 Dec 11 05:32:39 crc kubenswrapper[4628]: I1211 05:32:39.516773 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" event={"ID":"061f7965-d09e-4f01-9ee8-06638befdf0c","Type":"ContainerDied","Data":"f9f5853e3e8d41dccf4b7dd2da5f88496fb808528fa4f3aeac1360f1a200d080"} Dec 11 05:32:39 crc kubenswrapper[4628]: I1211 05:32:39.519464 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nd79k" event={"ID":"0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5","Type":"ContainerDied","Data":"accaf55d0501467fb144c105fc0ce3f216ea988a05939b7ce52f0c835b7ba4f6"} Dec 11 05:32:39 crc kubenswrapper[4628]: I1211 05:32:39.519488 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="accaf55d0501467fb144c105fc0ce3f216ea988a05939b7ce52f0c835b7ba4f6" Dec 11 05:32:39 crc kubenswrapper[4628]: I1211 05:32:39.519544 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nd79k" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.099506 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-759d9b665f-6pnnw"] Dec 11 05:32:40 crc kubenswrapper[4628]: E1211 05:32:40.111744 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5" containerName="barbican-db-sync" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.111771 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5" containerName="barbican-db-sync" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.112047 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5" containerName="barbican-db-sync" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.113184 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-759d9b665f-6pnnw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.120324 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.120746 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bh5kp" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.134241 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-759d9b665f-6pnnw"] Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.146175 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.235892 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b223e08d-3dfd-4c2d-b720-fe142822a27c-config-data\") pod \"barbican-worker-759d9b665f-6pnnw\" (UID: \"b223e08d-3dfd-4c2d-b720-fe142822a27c\") " pod="openstack/barbican-worker-759d9b665f-6pnnw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.235993 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b223e08d-3dfd-4c2d-b720-fe142822a27c-combined-ca-bundle\") pod \"barbican-worker-759d9b665f-6pnnw\" (UID: \"b223e08d-3dfd-4c2d-b720-fe142822a27c\") " pod="openstack/barbican-worker-759d9b665f-6pnnw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.236039 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b223e08d-3dfd-4c2d-b720-fe142822a27c-logs\") pod \"barbican-worker-759d9b665f-6pnnw\" (UID: \"b223e08d-3dfd-4c2d-b720-fe142822a27c\") " pod="openstack/barbican-worker-759d9b665f-6pnnw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.236216 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5c9s\" (UniqueName: \"kubernetes.io/projected/b223e08d-3dfd-4c2d-b720-fe142822a27c-kube-api-access-w5c9s\") pod \"barbican-worker-759d9b665f-6pnnw\" (UID: \"b223e08d-3dfd-4c2d-b720-fe142822a27c\") " pod="openstack/barbican-worker-759d9b665f-6pnnw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.236255 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b223e08d-3dfd-4c2d-b720-fe142822a27c-config-data-custom\") pod \"barbican-worker-759d9b665f-6pnnw\" (UID: \"b223e08d-3dfd-4c2d-b720-fe142822a27c\") " pod="openstack/barbican-worker-759d9b665f-6pnnw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.273688 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-b5c776c64-wmwpw"] Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.275238 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.277661 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.283259 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-t6ntr"] Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.284681 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.309214 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-t6ntr"] Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.338248 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5c9s\" (UniqueName: \"kubernetes.io/projected/b223e08d-3dfd-4c2d-b720-fe142822a27c-kube-api-access-w5c9s\") pod \"barbican-worker-759d9b665f-6pnnw\" (UID: \"b223e08d-3dfd-4c2d-b720-fe142822a27c\") " pod="openstack/barbican-worker-759d9b665f-6pnnw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.338292 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b223e08d-3dfd-4c2d-b720-fe142822a27c-config-data-custom\") pod \"barbican-worker-759d9b665f-6pnnw\" (UID: \"b223e08d-3dfd-4c2d-b720-fe142822a27c\") " pod="openstack/barbican-worker-759d9b665f-6pnnw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.338331 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b223e08d-3dfd-4c2d-b720-fe142822a27c-config-data\") pod \"barbican-worker-759d9b665f-6pnnw\" (UID: \"b223e08d-3dfd-4c2d-b720-fe142822a27c\") " pod="openstack/barbican-worker-759d9b665f-6pnnw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.338412 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/391ad8e5-9c8c-463c-8d25-4d74e3f8cf94-config-data\") pod \"barbican-keystone-listener-b5c776c64-wmwpw\" (UID: \"391ad8e5-9c8c-463c-8d25-4d74e3f8cf94\") " pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.338432 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b223e08d-3dfd-4c2d-b720-fe142822a27c-combined-ca-bundle\") pod \"barbican-worker-759d9b665f-6pnnw\" (UID: \"b223e08d-3dfd-4c2d-b720-fe142822a27c\") " pod="openstack/barbican-worker-759d9b665f-6pnnw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.338448 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/391ad8e5-9c8c-463c-8d25-4d74e3f8cf94-config-data-custom\") pod \"barbican-keystone-listener-b5c776c64-wmwpw\" (UID: \"391ad8e5-9c8c-463c-8d25-4d74e3f8cf94\") " pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.338468 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/391ad8e5-9c8c-463c-8d25-4d74e3f8cf94-combined-ca-bundle\") pod 
\"barbican-keystone-listener-b5c776c64-wmwpw\" (UID: \"391ad8e5-9c8c-463c-8d25-4d74e3f8cf94\") " pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.338500 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b223e08d-3dfd-4c2d-b720-fe142822a27c-logs\") pod \"barbican-worker-759d9b665f-6pnnw\" (UID: \"b223e08d-3dfd-4c2d-b720-fe142822a27c\") " pod="openstack/barbican-worker-759d9b665f-6pnnw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.338523 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/391ad8e5-9c8c-463c-8d25-4d74e3f8cf94-logs\") pod \"barbican-keystone-listener-b5c776c64-wmwpw\" (UID: \"391ad8e5-9c8c-463c-8d25-4d74e3f8cf94\") " pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.338543 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4njfz\" (UniqueName: \"kubernetes.io/projected/391ad8e5-9c8c-463c-8d25-4d74e3f8cf94-kube-api-access-4njfz\") pod \"barbican-keystone-listener-b5c776c64-wmwpw\" (UID: \"391ad8e5-9c8c-463c-8d25-4d74e3f8cf94\") " pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.343504 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b223e08d-3dfd-4c2d-b720-fe142822a27c-logs\") pod \"barbican-worker-759d9b665f-6pnnw\" (UID: \"b223e08d-3dfd-4c2d-b720-fe142822a27c\") " pod="openstack/barbican-worker-759d9b665f-6pnnw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.358662 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b223e08d-3dfd-4c2d-b720-fe142822a27c-combined-ca-bundle\") pod \"barbican-worker-759d9b665f-6pnnw\" (UID: \"b223e08d-3dfd-4c2d-b720-fe142822a27c\") " pod="openstack/barbican-worker-759d9b665f-6pnnw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.368722 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b223e08d-3dfd-4c2d-b720-fe142822a27c-config-data-custom\") pod \"barbican-worker-759d9b665f-6pnnw\" (UID: \"b223e08d-3dfd-4c2d-b720-fe142822a27c\") " pod="openstack/barbican-worker-759d9b665f-6pnnw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.373306 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5c9s\" (UniqueName: \"kubernetes.io/projected/b223e08d-3dfd-4c2d-b720-fe142822a27c-kube-api-access-w5c9s\") pod \"barbican-worker-759d9b665f-6pnnw\" (UID: \"b223e08d-3dfd-4c2d-b720-fe142822a27c\") " pod="openstack/barbican-worker-759d9b665f-6pnnw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.374794 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b223e08d-3dfd-4c2d-b720-fe142822a27c-config-data\") pod \"barbican-worker-759d9b665f-6pnnw\" (UID: \"b223e08d-3dfd-4c2d-b720-fe142822a27c\") " pod="openstack/barbican-worker-759d9b665f-6pnnw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.410590 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-b5c776c64-wmwpw"] Dec 11 05:32:40 crc 
kubenswrapper[4628]: I1211 05:32:40.435280 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-759d9b665f-6pnnw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.436805 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-75dfc9988b-75mkw"] Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.440957 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-config\") pod \"dnsmasq-dns-85ff748b95-t6ntr\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.441014 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-t6ntr\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.441039 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-dns-svc\") pod \"dnsmasq-dns-85ff748b95-t6ntr\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.441067 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-t6ntr\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.441090 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmlf2\" (UniqueName: \"kubernetes.io/projected/8161e0eb-57ee-447e-9427-2e93432ff767-kube-api-access-hmlf2\") pod \"dnsmasq-dns-85ff748b95-t6ntr\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.441216 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/391ad8e5-9c8c-463c-8d25-4d74e3f8cf94-config-data\") pod \"barbican-keystone-listener-b5c776c64-wmwpw\" (UID: \"391ad8e5-9c8c-463c-8d25-4d74e3f8cf94\") " pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.441244 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/391ad8e5-9c8c-463c-8d25-4d74e3f8cf94-config-data-custom\") pod \"barbican-keystone-listener-b5c776c64-wmwpw\" (UID: \"391ad8e5-9c8c-463c-8d25-4d74e3f8cf94\") " pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.441270 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-t6ntr\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") 
" pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.441286 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/391ad8e5-9c8c-463c-8d25-4d74e3f8cf94-combined-ca-bundle\") pod \"barbican-keystone-listener-b5c776c64-wmwpw\" (UID: \"391ad8e5-9c8c-463c-8d25-4d74e3f8cf94\") " pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.441337 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/391ad8e5-9c8c-463c-8d25-4d74e3f8cf94-logs\") pod \"barbican-keystone-listener-b5c776c64-wmwpw\" (UID: \"391ad8e5-9c8c-463c-8d25-4d74e3f8cf94\") " pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.441358 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4njfz\" (UniqueName: \"kubernetes.io/projected/391ad8e5-9c8c-463c-8d25-4d74e3f8cf94-kube-api-access-4njfz\") pod \"barbican-keystone-listener-b5c776c64-wmwpw\" (UID: \"391ad8e5-9c8c-463c-8d25-4d74e3f8cf94\") " pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.443124 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.444781 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/391ad8e5-9c8c-463c-8d25-4d74e3f8cf94-logs\") pod \"barbican-keystone-listener-b5c776c64-wmwpw\" (UID: \"391ad8e5-9c8c-463c-8d25-4d74e3f8cf94\") " pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.447274 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.458575 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/391ad8e5-9c8c-463c-8d25-4d74e3f8cf94-combined-ca-bundle\") pod \"barbican-keystone-listener-b5c776c64-wmwpw\" (UID: \"391ad8e5-9c8c-463c-8d25-4d74e3f8cf94\") " pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.465428 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/391ad8e5-9c8c-463c-8d25-4d74e3f8cf94-config-data\") pod \"barbican-keystone-listener-b5c776c64-wmwpw\" (UID: \"391ad8e5-9c8c-463c-8d25-4d74e3f8cf94\") " pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.472189 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4njfz\" (UniqueName: \"kubernetes.io/projected/391ad8e5-9c8c-463c-8d25-4d74e3f8cf94-kube-api-access-4njfz\") pod \"barbican-keystone-listener-b5c776c64-wmwpw\" (UID: \"391ad8e5-9c8c-463c-8d25-4d74e3f8cf94\") " pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.493382 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/391ad8e5-9c8c-463c-8d25-4d74e3f8cf94-config-data-custom\") pod 
\"barbican-keystone-listener-b5c776c64-wmwpw\" (UID: \"391ad8e5-9c8c-463c-8d25-4d74e3f8cf94\") " pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.521951 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75dfc9988b-75mkw"] Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.542599 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-config\") pod \"dnsmasq-dns-85ff748b95-t6ntr\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.542660 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-t6ntr\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.542683 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-dns-svc\") pod \"dnsmasq-dns-85ff748b95-t6ntr\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.542706 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-t6ntr\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.542728 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmlf2\" (UniqueName: \"kubernetes.io/projected/8161e0eb-57ee-447e-9427-2e93432ff767-kube-api-access-hmlf2\") pod \"dnsmasq-dns-85ff748b95-t6ntr\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.542761 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8557\" (UniqueName: \"kubernetes.io/projected/941752a7-2f91-4e06-97a4-3f47417006f8-kube-api-access-z8557\") pod \"barbican-api-75dfc9988b-75mkw\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.542809 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-t6ntr\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.542827 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941752a7-2f91-4e06-97a4-3f47417006f8-combined-ca-bundle\") pod \"barbican-api-75dfc9988b-75mkw\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.542859 
4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941752a7-2f91-4e06-97a4-3f47417006f8-config-data\") pod \"barbican-api-75dfc9988b-75mkw\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.542893 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/941752a7-2f91-4e06-97a4-3f47417006f8-config-data-custom\") pod \"barbican-api-75dfc9988b-75mkw\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.542908 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/941752a7-2f91-4e06-97a4-3f47417006f8-logs\") pod \"barbican-api-75dfc9988b-75mkw\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.543687 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-config\") pod \"dnsmasq-dns-85ff748b95-t6ntr\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.544190 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-t6ntr\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.544675 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-dns-svc\") pod \"dnsmasq-dns-85ff748b95-t6ntr\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.545088 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-t6ntr\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.545622 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-t6ntr\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.574816 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmlf2\" (UniqueName: \"kubernetes.io/projected/8161e0eb-57ee-447e-9427-2e93432ff767-kube-api-access-hmlf2\") pod \"dnsmasq-dns-85ff748b95-t6ntr\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.607786 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.621174 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.644736 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/941752a7-2f91-4e06-97a4-3f47417006f8-config-data-custom\") pod \"barbican-api-75dfc9988b-75mkw\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.644776 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/941752a7-2f91-4e06-97a4-3f47417006f8-logs\") pod \"barbican-api-75dfc9988b-75mkw\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.644900 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8557\" (UniqueName: \"kubernetes.io/projected/941752a7-2f91-4e06-97a4-3f47417006f8-kube-api-access-z8557\") pod \"barbican-api-75dfc9988b-75mkw\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.644952 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941752a7-2f91-4e06-97a4-3f47417006f8-combined-ca-bundle\") pod \"barbican-api-75dfc9988b-75mkw\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.644971 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941752a7-2f91-4e06-97a4-3f47417006f8-config-data\") pod \"barbican-api-75dfc9988b-75mkw\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.645301 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/941752a7-2f91-4e06-97a4-3f47417006f8-logs\") pod \"barbican-api-75dfc9988b-75mkw\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.648929 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941752a7-2f91-4e06-97a4-3f47417006f8-combined-ca-bundle\") pod \"barbican-api-75dfc9988b-75mkw\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.651172 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941752a7-2f91-4e06-97a4-3f47417006f8-config-data\") pod \"barbican-api-75dfc9988b-75mkw\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.658377 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/941752a7-2f91-4e06-97a4-3f47417006f8-config-data-custom\") pod \"barbican-api-75dfc9988b-75mkw\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.661300 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8557\" (UniqueName: \"kubernetes.io/projected/941752a7-2f91-4e06-97a4-3f47417006f8-kube-api-access-z8557\") pod \"barbican-api-75dfc9988b-75mkw\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.810012 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" podUID="061f7965-d09e-4f01-9ee8-06638befdf0c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.140:5353: connect: connection refused" Dec 11 05:32:40 crc kubenswrapper[4628]: I1211 05:32:40.863438 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.603313 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8b969" event={"ID":"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c","Type":"ContainerDied","Data":"f88493b1448a1c9f3a4c7bf996440c9d4e4427b134c06d4039b68f3a12e9a629"} Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.603617 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f88493b1448a1c9f3a4c7bf996440c9d4e4427b134c06d4039b68f3a12e9a629" Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.675935 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-79fmw" event={"ID":"90c5df18-e257-4561-8148-8cebd4644e40","Type":"ContainerDied","Data":"76dbd1452f45a8acdca8a7fe11bd6a62b923ed6fa51e9b2c247c72f1db2d91fb"} Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.675970 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76dbd1452f45a8acdca8a7fe11bd6a62b923ed6fa51e9b2c247c72f1db2d91fb" Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.690161 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8b969" Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.743596 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.795517 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-credential-keys\") pod \"90c5df18-e257-4561-8148-8cebd4644e40\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.795594 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-scripts\") pod \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.795662 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9jvz\" (UniqueName: \"kubernetes.io/projected/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-kube-api-access-h9jvz\") pod \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.795786 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-scripts\") pod \"90c5df18-e257-4561-8148-8cebd4644e40\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.795805 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-combined-ca-bundle\") pod \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.795836 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-logs\") pod \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.795978 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-config-data\") pod \"90c5df18-e257-4561-8148-8cebd4644e40\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.796003 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mrvg\" (UniqueName: \"kubernetes.io/projected/90c5df18-e257-4561-8148-8cebd4644e40-kube-api-access-6mrvg\") pod \"90c5df18-e257-4561-8148-8cebd4644e40\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.796054 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-fernet-keys\") pod \"90c5df18-e257-4561-8148-8cebd4644e40\" (UID: \"90c5df18-e257-4561-8148-8cebd4644e40\") " Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.796088 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-combined-ca-bundle\") pod \"90c5df18-e257-4561-8148-8cebd4644e40\" (UID: 
\"90c5df18-e257-4561-8148-8cebd4644e40\") " Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.796142 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-config-data\") pod \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\" (UID: \"c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c\") " Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.810652 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-scripts" (OuterVolumeSpecName: "scripts") pod "90c5df18-e257-4561-8148-8cebd4644e40" (UID: "90c5df18-e257-4561-8148-8cebd4644e40"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.823166 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-logs" (OuterVolumeSpecName: "logs") pod "c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c" (UID: "c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.852152 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-scripts" (OuterVolumeSpecName: "scripts") pod "c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c" (UID: "c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.862281 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-kube-api-access-h9jvz" (OuterVolumeSpecName: "kube-api-access-h9jvz") pod "c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c" (UID: "c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c"). InnerVolumeSpecName "kube-api-access-h9jvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.900952 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.900981 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9jvz\" (UniqueName: \"kubernetes.io/projected/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-kube-api-access-h9jvz\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.900992 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.901002 4628 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-logs\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.957249 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "90c5df18-e257-4561-8148-8cebd4644e40" (UID: "90c5df18-e257-4561-8148-8cebd4644e40"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.957299 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "90c5df18-e257-4561-8148-8cebd4644e40" (UID: "90c5df18-e257-4561-8148-8cebd4644e40"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.957379 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90c5df18-e257-4561-8148-8cebd4644e40-kube-api-access-6mrvg" (OuterVolumeSpecName: "kube-api-access-6mrvg") pod "90c5df18-e257-4561-8148-8cebd4644e40" (UID: "90c5df18-e257-4561-8148-8cebd4644e40"). InnerVolumeSpecName "kube-api-access-6mrvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:32:42 crc kubenswrapper[4628]: I1211 05:32:42.992053 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-config-data" (OuterVolumeSpecName: "config-data") pod "c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c" (UID: "c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.002499 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mrvg\" (UniqueName: \"kubernetes.io/projected/90c5df18-e257-4561-8148-8cebd4644e40-kube-api-access-6mrvg\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.002520 4628 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.002529 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.002538 4628 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.028463 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c" (UID: "c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.037905 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-config-data" (OuterVolumeSpecName: "config-data") pod "90c5df18-e257-4561-8148-8cebd4644e40" (UID: "90c5df18-e257-4561-8148-8cebd4644e40"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.066448 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90c5df18-e257-4561-8148-8cebd4644e40" (UID: "90c5df18-e257-4561-8148-8cebd4644e40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.085157 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.111376 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-dns-svc\") pod \"061f7965-d09e-4f01-9ee8-06638befdf0c\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.111418 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5m5m\" (UniqueName: \"kubernetes.io/projected/061f7965-d09e-4f01-9ee8-06638befdf0c-kube-api-access-b5m5m\") pod \"061f7965-d09e-4f01-9ee8-06638befdf0c\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.111476 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-ovsdbserver-sb\") pod \"061f7965-d09e-4f01-9ee8-06638befdf0c\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.111567 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-config\") pod \"061f7965-d09e-4f01-9ee8-06638befdf0c\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.111670 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-dns-swift-storage-0\") pod \"061f7965-d09e-4f01-9ee8-06638befdf0c\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.111703 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-ovsdbserver-nb\") pod \"061f7965-d09e-4f01-9ee8-06638befdf0c\" (UID: \"061f7965-d09e-4f01-9ee8-06638befdf0c\") " Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.112056 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.112067 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.112077 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/90c5df18-e257-4561-8148-8cebd4644e40-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.156312 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/061f7965-d09e-4f01-9ee8-06638befdf0c-kube-api-access-b5m5m" (OuterVolumeSpecName: "kube-api-access-b5m5m") pod "061f7965-d09e-4f01-9ee8-06638befdf0c" (UID: "061f7965-d09e-4f01-9ee8-06638befdf0c"). InnerVolumeSpecName "kube-api-access-b5m5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.213638 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5m5m\" (UniqueName: \"kubernetes.io/projected/061f7965-d09e-4f01-9ee8-06638befdf0c-kube-api-access-b5m5m\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.258534 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "061f7965-d09e-4f01-9ee8-06638befdf0c" (UID: "061f7965-d09e-4f01-9ee8-06638befdf0c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.266662 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "061f7965-d09e-4f01-9ee8-06638befdf0c" (UID: "061f7965-d09e-4f01-9ee8-06638befdf0c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.283976 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-759d9b665f-6pnnw"] Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.305050 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75dfc9988b-75mkw"] Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.311103 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-config" (OuterVolumeSpecName: "config") pod "061f7965-d09e-4f01-9ee8-06638befdf0c" (UID: "061f7965-d09e-4f01-9ee8-06638befdf0c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.316700 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "061f7965-d09e-4f01-9ee8-06638befdf0c" (UID: "061f7965-d09e-4f01-9ee8-06638befdf0c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.317880 4628 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.317896 4628 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.317905 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.317913 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.318116 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "061f7965-d09e-4f01-9ee8-06638befdf0c" (UID: "061f7965-d09e-4f01-9ee8-06638befdf0c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.422594 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/061f7965-d09e-4f01-9ee8-06638befdf0c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.491906 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-t6ntr"] Dec 11 05:32:43 crc kubenswrapper[4628]: W1211 05:32:43.505935 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8161e0eb_57ee_447e_9427_2e93432ff767.slice/crio-7b5eb76b80be93d2b0cd7987b7e5348afdb6fb137d7883099633f28f73444119 WatchSource:0}: Error finding container 7b5eb76b80be93d2b0cd7987b7e5348afdb6fb137d7883099633f28f73444119: Status 404 returned error can't find the container with id 7b5eb76b80be93d2b0cd7987b7e5348afdb6fb137d7883099633f28f73444119 Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.614343 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-b5c776c64-wmwpw"] Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.705497 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" event={"ID":"061f7965-d09e-4f01-9ee8-06638befdf0c","Type":"ContainerDied","Data":"61a097379bc9fc7062390f35f4827683f5fb1c639d424b3ec5577de8215e99a9"} Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.705547 4628 scope.go:117] "RemoveContainer" containerID="f9f5853e3e8d41dccf4b7dd2da5f88496fb808528fa4f3aeac1360f1a200d080" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.705660 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-dlqkw" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.714580 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-759d9b665f-6pnnw" event={"ID":"b223e08d-3dfd-4c2d-b720-fe142822a27c","Type":"ContainerStarted","Data":"37c1773a55147d971a2ce4defad004ca4ca1f0bcb60d55cbad5fcad0af36a8c2"} Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.729388 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75dfc9988b-75mkw" event={"ID":"941752a7-2f91-4e06-97a4-3f47417006f8","Type":"ContainerStarted","Data":"bd1ca41441ed2909f033a1fc363b5927e39487e612ebd1add956af235fa18f05"} Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.752488 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-dlqkw"] Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.766678 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-79fmw" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.769292 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" event={"ID":"8161e0eb-57ee-447e-9427-2e93432ff767","Type":"ContainerStarted","Data":"7b5eb76b80be93d2b0cd7987b7e5348afdb6fb137d7883099633f28f73444119"} Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.769406 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8b969" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.799390 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-dlqkw"] Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.835075 4628 scope.go:117] "RemoveContainer" containerID="714f134440ee888beb0da4181a1de1630acab07c653c5cc2dc91982416179464" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.850588 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-f64c5f7b6-ctrn9"] Dec 11 05:32:43 crc kubenswrapper[4628]: E1211 05:32:43.851071 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061f7965-d09e-4f01-9ee8-06638befdf0c" containerName="init" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.851087 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="061f7965-d09e-4f01-9ee8-06638befdf0c" containerName="init" Dec 11 05:32:43 crc kubenswrapper[4628]: E1211 05:32:43.851129 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c" containerName="placement-db-sync" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.851136 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c" containerName="placement-db-sync" Dec 11 05:32:43 crc kubenswrapper[4628]: E1211 05:32:43.851150 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90c5df18-e257-4561-8148-8cebd4644e40" containerName="keystone-bootstrap" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.851157 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c5df18-e257-4561-8148-8cebd4644e40" containerName="keystone-bootstrap" Dec 11 05:32:43 crc kubenswrapper[4628]: E1211 05:32:43.851164 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061f7965-d09e-4f01-9ee8-06638befdf0c" containerName="dnsmasq-dns" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.851170 4628 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="061f7965-d09e-4f01-9ee8-06638befdf0c" containerName="dnsmasq-dns" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.851343 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c" containerName="placement-db-sync" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.851355 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="90c5df18-e257-4561-8148-8cebd4644e40" containerName="keystone-bootstrap" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.851368 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="061f7965-d09e-4f01-9ee8-06638befdf0c" containerName="dnsmasq-dns" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.863481 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.870901 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.876355 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.937594 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b9644d-2578-4628-ac4c-28d16e0657e0-public-tls-certs\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.937657 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3b9644d-2578-4628-ac4c-28d16e0657e0-logs\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.937687 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3b9644d-2578-4628-ac4c-28d16e0657e0-config-data-custom\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.937709 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3b9644d-2578-4628-ac4c-28d16e0657e0-config-data\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.937823 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3b9644d-2578-4628-ac4c-28d16e0657e0-combined-ca-bundle\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.937861 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2x9w\" (UniqueName: \"kubernetes.io/projected/b3b9644d-2578-4628-ac4c-28d16e0657e0-kube-api-access-h2x9w\") pod 
\"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:43 crc kubenswrapper[4628]: I1211 05:32:43.937891 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b9644d-2578-4628-ac4c-28d16e0657e0-internal-tls-certs\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.045231 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3b9644d-2578-4628-ac4c-28d16e0657e0-combined-ca-bundle\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.045800 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2x9w\" (UniqueName: \"kubernetes.io/projected/b3b9644d-2578-4628-ac4c-28d16e0657e0-kube-api-access-h2x9w\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.046106 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b9644d-2578-4628-ac4c-28d16e0657e0-internal-tls-certs\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.046333 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b9644d-2578-4628-ac4c-28d16e0657e0-public-tls-certs\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.046417 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3b9644d-2578-4628-ac4c-28d16e0657e0-logs\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.046478 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3b9644d-2578-4628-ac4c-28d16e0657e0-config-data-custom\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.046505 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3b9644d-2578-4628-ac4c-28d16e0657e0-config-data\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.077629 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3b9644d-2578-4628-ac4c-28d16e0657e0-combined-ca-bundle\") pod 
\"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.078002 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b9644d-2578-4628-ac4c-28d16e0657e0-public-tls-certs\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.078860 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b9644d-2578-4628-ac4c-28d16e0657e0-internal-tls-certs\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.080420 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3b9644d-2578-4628-ac4c-28d16e0657e0-logs\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.082304 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="061f7965-d09e-4f01-9ee8-06638befdf0c" path="/var/lib/kubelet/pods/061f7965-d09e-4f01-9ee8-06638befdf0c/volumes" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.088260 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-f64c5f7b6-ctrn9"] Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.088301 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-55f69568c9-2p2zq"] Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.092596 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2x9w\" (UniqueName: \"kubernetes.io/projected/b3b9644d-2578-4628-ac4c-28d16e0657e0-kube-api-access-h2x9w\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.093949 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3b9644d-2578-4628-ac4c-28d16e0657e0-config-data-custom\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.123530 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55f69568c9-2p2zq"] Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.123565 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-589776fd4-wpbmd"] Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.126935 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.129621 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.130627 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.133515 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-58h9w" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.135330 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.137024 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.137200 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.145267 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3b9644d-2578-4628-ac4c-28d16e0657e0-config-data\") pod \"barbican-api-f64c5f7b6-ctrn9\" (UID: \"b3b9644d-2578-4628-ac4c-28d16e0657e0\") " pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.158621 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.161473 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.162071 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.163490 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.164126 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ggglx" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.164431 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.207938 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-589776fd4-wpbmd"] Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.284806 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf66d5d0-5466-4c43-ab27-23c603bd90f7-logs\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.284889 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf66d5d0-5466-4c43-ab27-23c603bd90f7-config-data\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.284912 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-scripts\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.284955 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx9dj\" (UniqueName: \"kubernetes.io/projected/3b27f97f-7392-47aa-8551-badeb28bce06-kube-api-access-sx9dj\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.284976 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf66d5d0-5466-4c43-ab27-23c603bd90f7-combined-ca-bundle\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.284992 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-credential-keys\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.285012 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf66d5d0-5466-4c43-ab27-23c603bd90f7-public-tls-certs\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.285026 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqvdd\" (UniqueName: \"kubernetes.io/projected/bf66d5d0-5466-4c43-ab27-23c603bd90f7-kube-api-access-cqvdd\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.285040 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-config-data\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.285053 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf66d5d0-5466-4c43-ab27-23c603bd90f7-internal-tls-certs\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.285073 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-fernet-keys\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.285120 4628 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-combined-ca-bundle\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.285138 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-public-tls-certs\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.285170 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf66d5d0-5466-4c43-ab27-23c603bd90f7-scripts\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.285204 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-internal-tls-certs\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.296573 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.386991 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-combined-ca-bundle\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.387033 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-public-tls-certs\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.387070 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf66d5d0-5466-4c43-ab27-23c603bd90f7-scripts\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.387102 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-internal-tls-certs\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.387121 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf66d5d0-5466-4c43-ab27-23c603bd90f7-logs\") pod \"placement-589776fd4-wpbmd\" (UID: 
\"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.387140 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf66d5d0-5466-4c43-ab27-23c603bd90f7-config-data\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.387155 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-scripts\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.387192 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx9dj\" (UniqueName: \"kubernetes.io/projected/3b27f97f-7392-47aa-8551-badeb28bce06-kube-api-access-sx9dj\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.387210 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf66d5d0-5466-4c43-ab27-23c603bd90f7-combined-ca-bundle\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.387227 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-credential-keys\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.387252 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf66d5d0-5466-4c43-ab27-23c603bd90f7-public-tls-certs\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.387268 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqvdd\" (UniqueName: \"kubernetes.io/projected/bf66d5d0-5466-4c43-ab27-23c603bd90f7-kube-api-access-cqvdd\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.387281 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf66d5d0-5466-4c43-ab27-23c603bd90f7-internal-tls-certs\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.387295 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-config-data\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 
crc kubenswrapper[4628]: I1211 05:32:44.387318 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-fernet-keys\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.390141 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf66d5d0-5466-4c43-ab27-23c603bd90f7-logs\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.398518 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-combined-ca-bundle\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.398907 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-scripts\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.399245 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-internal-tls-certs\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.402987 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-config-data\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.403108 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-credential-keys\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.403385 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-fernet-keys\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.406424 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b27f97f-7392-47aa-8551-badeb28bce06-public-tls-certs\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.410161 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx9dj\" (UniqueName: 
\"kubernetes.io/projected/3b27f97f-7392-47aa-8551-badeb28bce06-kube-api-access-sx9dj\") pod \"keystone-55f69568c9-2p2zq\" (UID: \"3b27f97f-7392-47aa-8551-badeb28bce06\") " pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.412855 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf66d5d0-5466-4c43-ab27-23c603bd90f7-scripts\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.420138 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqvdd\" (UniqueName: \"kubernetes.io/projected/bf66d5d0-5466-4c43-ab27-23c603bd90f7-kube-api-access-cqvdd\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.437256 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf66d5d0-5466-4c43-ab27-23c603bd90f7-internal-tls-certs\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.437493 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf66d5d0-5466-4c43-ab27-23c603bd90f7-combined-ca-bundle\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.462459 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.467044 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf66d5d0-5466-4c43-ab27-23c603bd90f7-public-tls-certs\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.473070 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf66d5d0-5466-4c43-ab27-23c603bd90f7-config-data\") pod \"placement-589776fd4-wpbmd\" (UID: \"bf66d5d0-5466-4c43-ab27-23c603bd90f7\") " pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.496021 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.886039 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2696c26e-6fad-43c9-975f-f73149e0466d","Type":"ContainerStarted","Data":"518a8f0506ec80d70872849302f680dac00ac6b3fa2af58787e0ed1816b6ad3e"} Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.910035 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" event={"ID":"391ad8e5-9c8c-463c-8d25-4d74e3f8cf94","Type":"ContainerStarted","Data":"344a1cdb2b728f8f109c7801a37b3e7b5c6f44fea4514526decf6d88c9ebcdd5"} Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.928491 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75dfc9988b-75mkw" event={"ID":"941752a7-2f91-4e06-97a4-3f47417006f8","Type":"ContainerStarted","Data":"37d42a7cd50ef5b4985065bbcb8c179a8164804d0231545865b08692691418dd"} Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.930379 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48b1c132-b854-4494-9e51-d934e9946366","Type":"ContainerStarted","Data":"2cc0d2efb2af9d076af7ad3bce70cfe8d55582bda79e759a79461ebeb68e4130"} Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.930701 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.930683192 podStartE2EDuration="11.930683192s" podCreationTimestamp="2025-12-11 05:32:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:32:44.928234606 +0000 UTC m=+1067.345581304" watchObservedRunningTime="2025-12-11 05:32:44.930683192 +0000 UTC m=+1067.348029890" Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.940736 4628 generic.go:334] "Generic (PLEG): container finished" podID="8161e0eb-57ee-447e-9427-2e93432ff767" containerID="961c0d0a8476731fc6fbfeb66baeb96663d5d22a13200ab6e0d719330848b431" exitCode=0 Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.941581 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" event={"ID":"8161e0eb-57ee-447e-9427-2e93432ff767","Type":"ContainerDied","Data":"961c0d0a8476731fc6fbfeb66baeb96663d5d22a13200ab6e0d719330848b431"} Dec 11 05:32:44 crc kubenswrapper[4628]: I1211 05:32:44.978901 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-f64c5f7b6-ctrn9"] Dec 11 05:32:45 crc kubenswrapper[4628]: I1211 05:32:45.098477 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-66bdd9d8cd-mgd96" podUID="8a3522a5-42e8-46ba-b794-d23582baa2a4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Dec 11 05:32:45 crc kubenswrapper[4628]: I1211 05:32:45.100067 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7989644c86-scmh4" podUID="51e02694-e634-4a3b-8406-3b3b72007c2b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Dec 11 05:32:45 crc kubenswrapper[4628]: I1211 05:32:45.590639 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-55f69568c9-2p2zq"] Dec 11 05:32:45 crc kubenswrapper[4628]: I1211 05:32:45.925372 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-589776fd4-wpbmd"] Dec 11 05:32:45 crc kubenswrapper[4628]: W1211 05:32:45.939099 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf66d5d0_5466_4c43_ab27_23c603bd90f7.slice/crio-cb0bdf50ed1a60197ae73f963c6b8cf324d8ff598f4da203ebaa4063ffe46369 WatchSource:0}: Error finding container cb0bdf50ed1a60197ae73f963c6b8cf324d8ff598f4da203ebaa4063ffe46369: Status 404 returned error can't find the container with id cb0bdf50ed1a60197ae73f963c6b8cf324d8ff598f4da203ebaa4063ffe46369 Dec 11 05:32:45 crc kubenswrapper[4628]: I1211 05:32:45.958786 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55f69568c9-2p2zq" event={"ID":"3b27f97f-7392-47aa-8551-badeb28bce06","Type":"ContainerStarted","Data":"5b6333eb8af11afb29d1b2a924653aa9d3c73f67f5cdafad69c30bf47955c41e"} Dec 11 05:32:45 crc kubenswrapper[4628]: I1211 05:32:45.960332 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f64c5f7b6-ctrn9" event={"ID":"b3b9644d-2578-4628-ac4c-28d16e0657e0","Type":"ContainerStarted","Data":"6f1a9baafadd09341b335d41692b6ce8c9fac5b2ca0a21ba89eac8989b897af3"} Dec 11 05:32:45 crc kubenswrapper[4628]: I1211 05:32:45.962662 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-589776fd4-wpbmd" event={"ID":"bf66d5d0-5466-4c43-ab27-23c603bd90f7","Type":"ContainerStarted","Data":"cb0bdf50ed1a60197ae73f963c6b8cf324d8ff598f4da203ebaa4063ffe46369"} Dec 11 05:32:46 crc kubenswrapper[4628]: I1211 05:32:46.979261 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n2b6t" event={"ID":"38627c48-4a86-4721-874d-8f386ea24495","Type":"ContainerStarted","Data":"9129f9730171d9f262b1e72ab8699858bf1860bac87e69c40b0ca3c7700d1323"} Dec 11 05:32:46 crc kubenswrapper[4628]: I1211 05:32:46.992089 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f64c5f7b6-ctrn9" event={"ID":"b3b9644d-2578-4628-ac4c-28d16e0657e0","Type":"ContainerStarted","Data":"8c1fbcdde3d757c578445389e04d3ecb12c74aec742246cc839bd62d52e69676"} Dec 11 05:32:47 crc kubenswrapper[4628]: I1211 05:32:47.000798 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-n2b6t" podStartSLOduration=6.192831038 podStartE2EDuration="1m3.000751322s" podCreationTimestamp="2025-12-11 05:31:44 +0000 UTC" firstStartedPulling="2025-12-11 05:31:46.628197917 +0000 UTC m=+1009.045544615" lastFinishedPulling="2025-12-11 05:32:43.436118211 +0000 UTC m=+1065.853464899" observedRunningTime="2025-12-11 05:32:46.997956286 +0000 UTC m=+1069.415302984" watchObservedRunningTime="2025-12-11 05:32:47.000751322 +0000 UTC m=+1069.418098020" Dec 11 05:32:47 crc kubenswrapper[4628]: I1211 05:32:47.013358 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" event={"ID":"8161e0eb-57ee-447e-9427-2e93432ff767","Type":"ContainerStarted","Data":"c6b73587c64433dc7884e9c392d10563099e6a91905f0725da0dffee46264fef"} Dec 11 05:32:47 crc kubenswrapper[4628]: I1211 05:32:47.013972 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:47 crc kubenswrapper[4628]: I1211 05:32:47.015748 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-589776fd4-wpbmd" event={"ID":"bf66d5d0-5466-4c43-ab27-23c603bd90f7","Type":"ContainerStarted","Data":"9e429021de49fac31b5643ea4a12b37fbbdbd2771c91bdf805db8a2072295cfe"} Dec 11 05:32:47 crc kubenswrapper[4628]: I1211 05:32:47.032228 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55f69568c9-2p2zq" event={"ID":"3b27f97f-7392-47aa-8551-badeb28bce06","Type":"ContainerStarted","Data":"ad20134f818d54ddf33d80de08ec042c0a12fb6d83184de8a7a7cbae79daa0c7"} Dec 11 05:32:47 crc kubenswrapper[4628]: I1211 05:32:47.032990 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:32:47 crc kubenswrapper[4628]: I1211 05:32:47.048617 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" podStartSLOduration=7.048596302 podStartE2EDuration="7.048596302s" podCreationTimestamp="2025-12-11 05:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:32:47.04819228 +0000 UTC m=+1069.465538978" watchObservedRunningTime="2025-12-11 05:32:47.048596302 +0000 UTC m=+1069.465943000" Dec 11 05:32:47 crc kubenswrapper[4628]: I1211 05:32:47.054510 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eb26f327-99d0-4eb1-8c92-d36b17068b04","Type":"ContainerStarted","Data":"c9777ab6617b13d26700daea25e423aaf1c9264d7da8fef44a14c84ff2fc9ddb"} Dec 11 05:32:47 crc kubenswrapper[4628]: I1211 05:32:47.057323 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75dfc9988b-75mkw" event={"ID":"941752a7-2f91-4e06-97a4-3f47417006f8","Type":"ContainerStarted","Data":"c31d2c821d8238f044f165c3224e831dd40e65fbfd4b786c97175686094a522e"} Dec 11 05:32:47 crc kubenswrapper[4628]: I1211 05:32:47.057718 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:47 crc kubenswrapper[4628]: I1211 05:32:47.058139 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:47 crc kubenswrapper[4628]: I1211 05:32:47.079701 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-55f69568c9-2p2zq" podStartSLOduration=4.079680406 podStartE2EDuration="4.079680406s" podCreationTimestamp="2025-12-11 05:32:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:32:47.06990045 +0000 UTC m=+1069.487247158" watchObservedRunningTime="2025-12-11 05:32:47.079680406 +0000 UTC m=+1069.497027104" Dec 11 05:32:47 crc kubenswrapper[4628]: I1211 05:32:47.126759 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=14.126745474 podStartE2EDuration="14.126745474s" podCreationTimestamp="2025-12-11 05:32:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:32:47.123734932 +0000 UTC m=+1069.541081630" watchObservedRunningTime="2025-12-11 05:32:47.126745474 +0000 UTC m=+1069.544092172" Dec 11 05:32:47 crc kubenswrapper[4628]: I1211 05:32:47.152108 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-api-75dfc9988b-75mkw" podStartSLOduration=7.152089283 podStartE2EDuration="7.152089283s" podCreationTimestamp="2025-12-11 05:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:32:47.147414515 +0000 UTC m=+1069.564761213" watchObservedRunningTime="2025-12-11 05:32:47.152089283 +0000 UTC m=+1069.569435981" Dec 11 05:32:50 crc kubenswrapper[4628]: I1211 05:32:50.113420 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-f64c5f7b6-ctrn9" event={"ID":"b3b9644d-2578-4628-ac4c-28d16e0657e0","Type":"ContainerStarted","Data":"f6666dec778144f1f41c455501d9cb66e312e00e8939cf306a753a2156b979a0"} Dec 11 05:32:50 crc kubenswrapper[4628]: I1211 05:32:50.115152 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:50 crc kubenswrapper[4628]: I1211 05:32:50.115290 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:50 crc kubenswrapper[4628]: I1211 05:32:50.118872 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-759d9b665f-6pnnw" event={"ID":"b223e08d-3dfd-4c2d-b720-fe142822a27c","Type":"ContainerStarted","Data":"2657d408e3e2c089cdfca14291c86ad7a643aee3ac021101768bd9b5f03a9f55"} Dec 11 05:32:50 crc kubenswrapper[4628]: I1211 05:32:50.118939 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-759d9b665f-6pnnw" event={"ID":"b223e08d-3dfd-4c2d-b720-fe142822a27c","Type":"ContainerStarted","Data":"09f7fd17ddd4bb25ffeef116fce316a33dcf66136d7ed93dfeabe6bf256a9bc3"} Dec 11 05:32:50 crc kubenswrapper[4628]: I1211 05:32:50.120890 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-589776fd4-wpbmd" event={"ID":"bf66d5d0-5466-4c43-ab27-23c603bd90f7","Type":"ContainerStarted","Data":"97da49121ce8444bc2368b106159639e5ae18cdf33a6914c50db093a10189501"} Dec 11 05:32:50 crc kubenswrapper[4628]: I1211 05:32:50.121691 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:50 crc kubenswrapper[4628]: I1211 05:32:50.121719 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:50 crc kubenswrapper[4628]: I1211 05:32:50.126269 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" event={"ID":"391ad8e5-9c8c-463c-8d25-4d74e3f8cf94","Type":"ContainerStarted","Data":"ddab0fd5cf70be00fbabc6ba4418f72e8e4288a15c0446feed24ede6bfa89101"} Dec 11 05:32:50 crc kubenswrapper[4628]: I1211 05:32:50.126313 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" event={"ID":"391ad8e5-9c8c-463c-8d25-4d74e3f8cf94","Type":"ContainerStarted","Data":"1b2867914cee933d9a22343aa6a6564b1f8a2b1f0f253bd14dc59a49fc3e05cf"} Dec 11 05:32:50 crc kubenswrapper[4628]: I1211 05:32:50.141834 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-f64c5f7b6-ctrn9" podStartSLOduration=7.14181194 podStartE2EDuration="7.14181194s" podCreationTimestamp="2025-12-11 05:32:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:32:50.134988344 +0000 UTC m=+1072.552335042" 
watchObservedRunningTime="2025-12-11 05:32:50.14181194 +0000 UTC m=+1072.559158628" Dec 11 05:32:50 crc kubenswrapper[4628]: I1211 05:32:50.149576 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-589776fd4-wpbmd" podStartSLOduration=7.149558589 podStartE2EDuration="7.149558589s" podCreationTimestamp="2025-12-11 05:32:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:32:50.149301293 +0000 UTC m=+1072.566647991" watchObservedRunningTime="2025-12-11 05:32:50.149558589 +0000 UTC m=+1072.566905287" Dec 11 05:32:50 crc kubenswrapper[4628]: I1211 05:32:50.181926 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-b5c776c64-wmwpw" podStartSLOduration=4.515124905 podStartE2EDuration="10.181907688s" podCreationTimestamp="2025-12-11 05:32:40 +0000 UTC" firstStartedPulling="2025-12-11 05:32:43.720137384 +0000 UTC m=+1066.137484082" lastFinishedPulling="2025-12-11 05:32:49.386920147 +0000 UTC m=+1071.804266865" observedRunningTime="2025-12-11 05:32:50.167129947 +0000 UTC m=+1072.584476645" watchObservedRunningTime="2025-12-11 05:32:50.181907688 +0000 UTC m=+1072.599254386" Dec 11 05:32:51 crc kubenswrapper[4628]: I1211 05:32:51.161485 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-759d9b665f-6pnnw" podStartSLOduration=5.237925127 podStartE2EDuration="11.161461712s" podCreationTimestamp="2025-12-11 05:32:40 +0000 UTC" firstStartedPulling="2025-12-11 05:32:43.463381552 +0000 UTC m=+1065.880728260" lastFinishedPulling="2025-12-11 05:32:49.386918147 +0000 UTC m=+1071.804264845" observedRunningTime="2025-12-11 05:32:51.160973599 +0000 UTC m=+1073.578320307" watchObservedRunningTime="2025-12-11 05:32:51.161461712 +0000 UTC m=+1073.578808410" Dec 11 05:32:52 crc kubenswrapper[4628]: I1211 05:32:52.505106 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:32:52 crc kubenswrapper[4628]: I1211 05:32:52.519005 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:53 crc kubenswrapper[4628]: I1211 05:32:53.560957 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:32:53 crc kubenswrapper[4628]: I1211 05:32:53.676364 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:53 crc kubenswrapper[4628]: I1211 05:32:53.882173 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 11 05:32:53 crc kubenswrapper[4628]: I1211 05:32:53.882230 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 11 05:32:53 crc kubenswrapper[4628]: I1211 05:32:53.925495 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 11 05:32:53 crc kubenswrapper[4628]: I1211 05:32:53.928280 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 11 05:32:53 crc kubenswrapper[4628]: I1211 05:32:53.928307 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Dec 11 05:32:53 crc kubenswrapper[4628]: I1211 05:32:53.970345 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 11 05:32:54 crc kubenswrapper[4628]: I1211 05:32:54.048428 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 11 05:32:54 crc kubenswrapper[4628]: I1211 05:32:54.055548 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 11 05:32:54 crc kubenswrapper[4628]: I1211 05:32:54.175309 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 11 05:32:54 crc kubenswrapper[4628]: I1211 05:32:54.175557 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 11 05:32:54 crc kubenswrapper[4628]: I1211 05:32:54.176264 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 11 05:32:54 crc kubenswrapper[4628]: I1211 05:32:54.177298 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 11 05:32:55 crc kubenswrapper[4628]: I1211 05:32:55.095715 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-66bdd9d8cd-mgd96" podUID="8a3522a5-42e8-46ba-b794-d23582baa2a4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Dec 11 05:32:55 crc kubenswrapper[4628]: I1211 05:32:55.096389 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7989644c86-scmh4" podUID="51e02694-e634-4a3b-8406-3b3b72007c2b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Dec 11 05:32:55 crc kubenswrapper[4628]: I1211 05:32:55.279198 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67df497849-l9zzv" Dec 11 05:32:55 crc kubenswrapper[4628]: I1211 05:32:55.446971 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:32:55 crc kubenswrapper[4628]: I1211 05:32:55.450619 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f4cbc8496-gqwsj"] Dec 11 05:32:55 crc kubenswrapper[4628]: I1211 05:32:55.450822 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f4cbc8496-gqwsj" podUID="32a1d94f-dcff-4648-8b3e-0b54cf493211" containerName="neutron-api" containerID="cri-o://397390f28ff14aee582539906e70c1bfcf7d07ac5cf0aa724a9d0e6e5033daab" gracePeriod=30 Dec 11 05:32:55 crc kubenswrapper[4628]: I1211 05:32:55.450973 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f4cbc8496-gqwsj" podUID="32a1d94f-dcff-4648-8b3e-0b54cf493211" containerName="neutron-httpd" containerID="cri-o://ce845a4eb1c022bffc53c8bb0332d1e2eeb8cd5361b82915aba90b6b40ff16f9" gracePeriod=30 Dec 11 05:32:55 crc kubenswrapper[4628]: I1211 05:32:55.623990 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:32:55 crc kubenswrapper[4628]: I1211 
05:32:55.726248 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-lh2sk"] Dec 11 05:32:55 crc kubenswrapper[4628]: I1211 05:32:55.726465 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" podUID="36ef8b87-70d5-4803-a864-19cde1a04b87" containerName="dnsmasq-dns" containerID="cri-o://c844db229559a48791024563eb13063999c90e5c8aff7101cb52ccbaf156c0d6" gracePeriod=10 Dec 11 05:32:56 crc kubenswrapper[4628]: I1211 05:32:56.191838 4628 generic.go:334] "Generic (PLEG): container finished" podID="38627c48-4a86-4721-874d-8f386ea24495" containerID="9129f9730171d9f262b1e72ab8699858bf1860bac87e69c40b0ca3c7700d1323" exitCode=0 Dec 11 05:32:56 crc kubenswrapper[4628]: I1211 05:32:56.191927 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n2b6t" event={"ID":"38627c48-4a86-4721-874d-8f386ea24495","Type":"ContainerDied","Data":"9129f9730171d9f262b1e72ab8699858bf1860bac87e69c40b0ca3c7700d1323"} Dec 11 05:32:56 crc kubenswrapper[4628]: I1211 05:32:56.195174 4628 generic.go:334] "Generic (PLEG): container finished" podID="32a1d94f-dcff-4648-8b3e-0b54cf493211" containerID="ce845a4eb1c022bffc53c8bb0332d1e2eeb8cd5361b82915aba90b6b40ff16f9" exitCode=0 Dec 11 05:32:56 crc kubenswrapper[4628]: I1211 05:32:56.195300 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f4cbc8496-gqwsj" event={"ID":"32a1d94f-dcff-4648-8b3e-0b54cf493211","Type":"ContainerDied","Data":"ce845a4eb1c022bffc53c8bb0332d1e2eeb8cd5361b82915aba90b6b40ff16f9"} Dec 11 05:32:56 crc kubenswrapper[4628]: I1211 05:32:56.202600 4628 generic.go:334] "Generic (PLEG): container finished" podID="36ef8b87-70d5-4803-a864-19cde1a04b87" containerID="c844db229559a48791024563eb13063999c90e5c8aff7101cb52ccbaf156c0d6" exitCode=0 Dec 11 05:32:56 crc kubenswrapper[4628]: I1211 05:32:56.202743 4628 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 05:32:56 crc kubenswrapper[4628]: I1211 05:32:56.202811 4628 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 05:32:56 crc kubenswrapper[4628]: I1211 05:32:56.204739 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" event={"ID":"36ef8b87-70d5-4803-a864-19cde1a04b87","Type":"ContainerDied","Data":"c844db229559a48791024563eb13063999c90e5c8aff7101cb52ccbaf156c0d6"} Dec 11 05:32:57 crc kubenswrapper[4628]: I1211 05:32:57.514277 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" podUID="36ef8b87-70d5-4803-a864-19cde1a04b87" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.150:5353: connect: connection refused" Dec 11 05:32:57 crc kubenswrapper[4628]: I1211 05:32:57.847926 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-f64c5f7b6-ctrn9" Dec 11 05:32:58 crc kubenswrapper[4628]: I1211 05:32:58.044051 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-75dfc9988b-75mkw"] Dec 11 05:32:58 crc kubenswrapper[4628]: I1211 05:32:58.044433 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-75dfc9988b-75mkw" podUID="941752a7-2f91-4e06-97a4-3f47417006f8" containerName="barbican-api-log" containerID="cri-o://37d42a7cd50ef5b4985065bbcb8c179a8164804d0231545865b08692691418dd" gracePeriod=30 Dec 11 05:32:58 crc kubenswrapper[4628]: I1211 05:32:58.044771 
4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-75dfc9988b-75mkw" podUID="941752a7-2f91-4e06-97a4-3f47417006f8" containerName="barbican-api" containerID="cri-o://c31d2c821d8238f044f165c3224e831dd40e65fbfd4b786c97175686094a522e" gracePeriod=30 Dec 11 05:32:58 crc kubenswrapper[4628]: I1211 05:32:58.070439 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-75dfc9988b-75mkw" podUID="941752a7-2f91-4e06-97a4-3f47417006f8" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": EOF" Dec 11 05:32:58 crc kubenswrapper[4628]: I1211 05:32:58.070479 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-75dfc9988b-75mkw" podUID="941752a7-2f91-4e06-97a4-3f47417006f8" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": EOF" Dec 11 05:32:58 crc kubenswrapper[4628]: I1211 05:32:58.233315 4628 generic.go:334] "Generic (PLEG): container finished" podID="941752a7-2f91-4e06-97a4-3f47417006f8" containerID="37d42a7cd50ef5b4985065bbcb8c179a8164804d0231545865b08692691418dd" exitCode=143 Dec 11 05:32:58 crc kubenswrapper[4628]: I1211 05:32:58.233609 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75dfc9988b-75mkw" event={"ID":"941752a7-2f91-4e06-97a4-3f47417006f8","Type":"ContainerDied","Data":"37d42a7cd50ef5b4985065bbcb8c179a8164804d0231545865b08692691418dd"} Dec 11 05:33:00 crc kubenswrapper[4628]: I1211 05:33:00.734405 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 11 05:33:00 crc kubenswrapper[4628]: I1211 05:33:00.734968 4628 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 05:33:00 crc kubenswrapper[4628]: I1211 05:33:00.736733 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 11 05:33:00 crc kubenswrapper[4628]: I1211 05:33:00.737118 4628 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 05:33:00 crc kubenswrapper[4628]: I1211 05:33:00.747240 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 11 05:33:00 crc kubenswrapper[4628]: I1211 05:33:00.770519 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 11 05:33:01 crc kubenswrapper[4628]: I1211 05:33:01.261559 4628 generic.go:334] "Generic (PLEG): container finished" podID="32a1d94f-dcff-4648-8b3e-0b54cf493211" containerID="397390f28ff14aee582539906e70c1bfcf7d07ac5cf0aa724a9d0e6e5033daab" exitCode=0 Dec 11 05:33:01 crc kubenswrapper[4628]: I1211 05:33:01.261643 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f4cbc8496-gqwsj" event={"ID":"32a1d94f-dcff-4648-8b3e-0b54cf493211","Type":"ContainerDied","Data":"397390f28ff14aee582539906e70c1bfcf7d07ac5cf0aa724a9d0e6e5033daab"} Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.513538 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" podUID="36ef8b87-70d5-4803-a864-19cde1a04b87" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.150:5353: connect: connection refused" Dec 11 05:33:02 crc kubenswrapper[4628]: E1211 05:33:02.578644 4628 log.go:32] "PullImage from image service failed" err="rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Dec 11 05:33:02 crc kubenswrapper[4628]: E1211 05:33:02.578814 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-njt7h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(48b1c132-b854-4494-9e51-d934e9946366): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 11 05:33:02 crc kubenswrapper[4628]: E1211 05:33:02.580425 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" 
podUID="48b1c132-b854-4494-9e51-d934e9946366" Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.679425 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.736832 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-combined-ca-bundle\") pod \"38627c48-4a86-4721-874d-8f386ea24495\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.736913 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38627c48-4a86-4721-874d-8f386ea24495-etc-machine-id\") pod \"38627c48-4a86-4721-874d-8f386ea24495\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.736940 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-scripts\") pod \"38627c48-4a86-4721-874d-8f386ea24495\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.737025 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp4x5\" (UniqueName: \"kubernetes.io/projected/38627c48-4a86-4721-874d-8f386ea24495-kube-api-access-hp4x5\") pod \"38627c48-4a86-4721-874d-8f386ea24495\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.737103 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-config-data\") pod \"38627c48-4a86-4721-874d-8f386ea24495\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.737162 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-db-sync-config-data\") pod \"38627c48-4a86-4721-874d-8f386ea24495\" (UID: \"38627c48-4a86-4721-874d-8f386ea24495\") " Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.747569 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38627c48-4a86-4721-874d-8f386ea24495-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "38627c48-4a86-4721-874d-8f386ea24495" (UID: "38627c48-4a86-4721-874d-8f386ea24495"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.757966 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "38627c48-4a86-4721-874d-8f386ea24495" (UID: "38627c48-4a86-4721-874d-8f386ea24495"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.758124 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38627c48-4a86-4721-874d-8f386ea24495-kube-api-access-hp4x5" (OuterVolumeSpecName: "kube-api-access-hp4x5") pod "38627c48-4a86-4721-874d-8f386ea24495" (UID: "38627c48-4a86-4721-874d-8f386ea24495"). InnerVolumeSpecName "kube-api-access-hp4x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.771792 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-scripts" (OuterVolumeSpecName: "scripts") pod "38627c48-4a86-4721-874d-8f386ea24495" (UID: "38627c48-4a86-4721-874d-8f386ea24495"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.785250 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38627c48-4a86-4721-874d-8f386ea24495" (UID: "38627c48-4a86-4721-874d-8f386ea24495"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.842550 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.842577 4628 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38627c48-4a86-4721-874d-8f386ea24495-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.842587 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.842595 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp4x5\" (UniqueName: \"kubernetes.io/projected/38627c48-4a86-4721-874d-8f386ea24495-kube-api-access-hp4x5\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.842607 4628 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.848053 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-config-data" (OuterVolumeSpecName: "config-data") pod "38627c48-4a86-4721-874d-8f386ea24495" (UID: "38627c48-4a86-4721-874d-8f386ea24495"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.886444 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.944258 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-config\") pod \"36ef8b87-70d5-4803-a864-19cde1a04b87\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.944355 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-dns-svc\") pod \"36ef8b87-70d5-4803-a864-19cde1a04b87\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.947033 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-ovsdbserver-nb\") pod \"36ef8b87-70d5-4803-a864-19cde1a04b87\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.947335 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztnpl\" (UniqueName: \"kubernetes.io/projected/36ef8b87-70d5-4803-a864-19cde1a04b87-kube-api-access-ztnpl\") pod \"36ef8b87-70d5-4803-a864-19cde1a04b87\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.947381 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-ovsdbserver-sb\") pod \"36ef8b87-70d5-4803-a864-19cde1a04b87\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.947407 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-dns-swift-storage-0\") pod \"36ef8b87-70d5-4803-a864-19cde1a04b87\" (UID: \"36ef8b87-70d5-4803-a864-19cde1a04b87\") " Dec 11 05:33:02 crc kubenswrapper[4628]: I1211 05:33:02.948355 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38627c48-4a86-4721-874d-8f386ea24495-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.001086 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ef8b87-70d5-4803-a864-19cde1a04b87-kube-api-access-ztnpl" (OuterVolumeSpecName: "kube-api-access-ztnpl") pod "36ef8b87-70d5-4803-a864-19cde1a04b87" (UID: "36ef8b87-70d5-4803-a864-19cde1a04b87"). InnerVolumeSpecName "kube-api-access-ztnpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.030498 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "36ef8b87-70d5-4803-a864-19cde1a04b87" (UID: "36ef8b87-70d5-4803-a864-19cde1a04b87"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.035375 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "36ef8b87-70d5-4803-a864-19cde1a04b87" (UID: "36ef8b87-70d5-4803-a864-19cde1a04b87"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.039673 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "36ef8b87-70d5-4803-a864-19cde1a04b87" (UID: "36ef8b87-70d5-4803-a864-19cde1a04b87"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.049728 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztnpl\" (UniqueName: \"kubernetes.io/projected/36ef8b87-70d5-4803-a864-19cde1a04b87-kube-api-access-ztnpl\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.049879 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.049890 4628 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.049899 4628 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.054757 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-config" (OuterVolumeSpecName: "config") pod "36ef8b87-70d5-4803-a864-19cde1a04b87" (UID: "36ef8b87-70d5-4803-a864-19cde1a04b87"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.064986 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "36ef8b87-70d5-4803-a864-19cde1a04b87" (UID: "36ef8b87-70d5-4803-a864-19cde1a04b87"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.139596 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.152167 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.152209 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36ef8b87-70d5-4803-a864-19cde1a04b87-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.253368 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-ovndb-tls-certs\") pod \"32a1d94f-dcff-4648-8b3e-0b54cf493211\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.253518 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-combined-ca-bundle\") pod \"32a1d94f-dcff-4648-8b3e-0b54cf493211\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.253614 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvbpx\" (UniqueName: \"kubernetes.io/projected/32a1d94f-dcff-4648-8b3e-0b54cf493211-kube-api-access-cvbpx\") pod \"32a1d94f-dcff-4648-8b3e-0b54cf493211\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.253654 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-httpd-config\") pod \"32a1d94f-dcff-4648-8b3e-0b54cf493211\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.253686 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-config\") pod \"32a1d94f-dcff-4648-8b3e-0b54cf493211\" (UID: \"32a1d94f-dcff-4648-8b3e-0b54cf493211\") " Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.257806 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "32a1d94f-dcff-4648-8b3e-0b54cf493211" (UID: "32a1d94f-dcff-4648-8b3e-0b54cf493211"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.258737 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a1d94f-dcff-4648-8b3e-0b54cf493211-kube-api-access-cvbpx" (OuterVolumeSpecName: "kube-api-access-cvbpx") pod "32a1d94f-dcff-4648-8b3e-0b54cf493211" (UID: "32a1d94f-dcff-4648-8b3e-0b54cf493211"). InnerVolumeSpecName "kube-api-access-cvbpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.290193 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f4cbc8496-gqwsj" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.290560 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f4cbc8496-gqwsj" event={"ID":"32a1d94f-dcff-4648-8b3e-0b54cf493211","Type":"ContainerDied","Data":"e9326b9c6124a11357a334c5cc7389070bd8e358da272bffbce67fa77ca421c9"} Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.290649 4628 scope.go:117] "RemoveContainer" containerID="ce845a4eb1c022bffc53c8bb0332d1e2eeb8cd5361b82915aba90b6b40ff16f9" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.296625 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" event={"ID":"36ef8b87-70d5-4803-a864-19cde1a04b87","Type":"ContainerDied","Data":"35b2805a276feea8e1206590cd897e21d12830ac9f16d8daa13112296f352af8"} Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.296725 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-lh2sk" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.300396 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-n2b6t" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.301864 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n2b6t" event={"ID":"38627c48-4a86-4721-874d-8f386ea24495","Type":"ContainerDied","Data":"fe1f836871e8360ac99a4544a603fba688b30bd48c460659afb8be5e1d95bb68"} Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.301905 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe1f836871e8360ac99a4544a603fba688b30bd48c460659afb8be5e1d95bb68" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.303259 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="48b1c132-b854-4494-9e51-d934e9946366" containerName="ceilometer-notification-agent" containerID="cri-o://d97914ed64d5ccdeb58d74607b9c2b8cd80b527c374db55d801870f988ed65c2" gracePeriod=30 Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.303492 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="48b1c132-b854-4494-9e51-d934e9946366" containerName="sg-core" containerID="cri-o://2cc0d2efb2af9d076af7ad3bce70cfe8d55582bda79e759a79461ebeb68e4130" gracePeriod=30 Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.332990 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32a1d94f-dcff-4648-8b3e-0b54cf493211" (UID: "32a1d94f-dcff-4648-8b3e-0b54cf493211"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.338183 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-config" (OuterVolumeSpecName: "config") pod "32a1d94f-dcff-4648-8b3e-0b54cf493211" (UID: "32a1d94f-dcff-4648-8b3e-0b54cf493211"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.355298 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvbpx\" (UniqueName: \"kubernetes.io/projected/32a1d94f-dcff-4648-8b3e-0b54cf493211-kube-api-access-cvbpx\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.355320 4628 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.355329 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.355339 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.355377 4628 scope.go:117] "RemoveContainer" containerID="397390f28ff14aee582539906e70c1bfcf7d07ac5cf0aa724a9d0e6e5033daab" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.376302 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-lh2sk"] Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.389067 4628 scope.go:117] "RemoveContainer" containerID="c844db229559a48791024563eb13063999c90e5c8aff7101cb52ccbaf156c0d6" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.391608 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-lh2sk"] Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.409094 4628 scope.go:117] "RemoveContainer" containerID="81d6e31888c76b189e3f1400f54dcbe72b649997247dd4879cc00d1836000919" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.411782 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "32a1d94f-dcff-4648-8b3e-0b54cf493211" (UID: "32a1d94f-dcff-4648-8b3e-0b54cf493211"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.457402 4628 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/32a1d94f-dcff-4648-8b3e-0b54cf493211-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.470343 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-75dfc9988b-75mkw" podUID="941752a7-2f91-4e06-97a4-3f47417006f8" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:46902->10.217.0.157:9311: read: connection reset by peer" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.470664 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-75dfc9988b-75mkw" podUID="941752a7-2f91-4e06-97a4-3f47417006f8" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:46888->10.217.0.157:9311: read: connection reset by peer" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.625653 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f4cbc8496-gqwsj"] Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.632817 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7f4cbc8496-gqwsj"] Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.841451 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.869724 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/941752a7-2f91-4e06-97a4-3f47417006f8-config-data-custom\") pod \"941752a7-2f91-4e06-97a4-3f47417006f8\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.869934 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941752a7-2f91-4e06-97a4-3f47417006f8-config-data\") pod \"941752a7-2f91-4e06-97a4-3f47417006f8\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.869970 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/941752a7-2f91-4e06-97a4-3f47417006f8-logs\") pod \"941752a7-2f91-4e06-97a4-3f47417006f8\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.869990 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941752a7-2f91-4e06-97a4-3f47417006f8-combined-ca-bundle\") pod \"941752a7-2f91-4e06-97a4-3f47417006f8\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.870072 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8557\" (UniqueName: \"kubernetes.io/projected/941752a7-2f91-4e06-97a4-3f47417006f8-kube-api-access-z8557\") pod \"941752a7-2f91-4e06-97a4-3f47417006f8\" (UID: \"941752a7-2f91-4e06-97a4-3f47417006f8\") " Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.870818 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/941752a7-2f91-4e06-97a4-3f47417006f8-logs" (OuterVolumeSpecName: "logs") pod "941752a7-2f91-4e06-97a4-3f47417006f8" (UID: "941752a7-2f91-4e06-97a4-3f47417006f8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.927736 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/941752a7-2f91-4e06-97a4-3f47417006f8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "941752a7-2f91-4e06-97a4-3f47417006f8" (UID: "941752a7-2f91-4e06-97a4-3f47417006f8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.962225 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/941752a7-2f91-4e06-97a4-3f47417006f8-kube-api-access-z8557" (OuterVolumeSpecName: "kube-api-access-z8557") pod "941752a7-2f91-4e06-97a4-3f47417006f8" (UID: "941752a7-2f91-4e06-97a4-3f47417006f8"). InnerVolumeSpecName "kube-api-access-z8557". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.964830 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a1d94f-dcff-4648-8b3e-0b54cf493211" path="/var/lib/kubelet/pods/32a1d94f-dcff-4648-8b3e-0b54cf493211/volumes" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.965781 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36ef8b87-70d5-4803-a864-19cde1a04b87" path="/var/lib/kubelet/pods/36ef8b87-70d5-4803-a864-19cde1a04b87/volumes" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.975959 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/941752a7-2f91-4e06-97a4-3f47417006f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "941752a7-2f91-4e06-97a4-3f47417006f8" (UID: "941752a7-2f91-4e06-97a4-3f47417006f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.983406 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8557\" (UniqueName: \"kubernetes.io/projected/941752a7-2f91-4e06-97a4-3f47417006f8-kube-api-access-z8557\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.983429 4628 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/941752a7-2f91-4e06-97a4-3f47417006f8-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.983437 4628 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/941752a7-2f91-4e06-97a4-3f47417006f8-logs\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:03 crc kubenswrapper[4628]: I1211 05:33:03.983449 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941752a7-2f91-4e06-97a4-3f47417006f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.064765 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/941752a7-2f91-4e06-97a4-3f47417006f8-config-data" (OuterVolumeSpecName: "config-data") pod "941752a7-2f91-4e06-97a4-3f47417006f8" (UID: "941752a7-2f91-4e06-97a4-3f47417006f8"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.084630 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941752a7-2f91-4e06-97a4-3f47417006f8-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.100972 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 11 05:33:04 crc kubenswrapper[4628]: E1211 05:33:04.101328 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="941752a7-2f91-4e06-97a4-3f47417006f8" containerName="barbican-api-log" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.101345 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="941752a7-2f91-4e06-97a4-3f47417006f8" containerName="barbican-api-log" Dec 11 05:33:04 crc kubenswrapper[4628]: E1211 05:33:04.101376 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38627c48-4a86-4721-874d-8f386ea24495" containerName="cinder-db-sync" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.101383 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="38627c48-4a86-4721-874d-8f386ea24495" containerName="cinder-db-sync" Dec 11 05:33:04 crc kubenswrapper[4628]: E1211 05:33:04.101393 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36ef8b87-70d5-4803-a864-19cde1a04b87" containerName="init" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.101398 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="36ef8b87-70d5-4803-a864-19cde1a04b87" containerName="init" Dec 11 05:33:04 crc kubenswrapper[4628]: E1211 05:33:04.101416 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a1d94f-dcff-4648-8b3e-0b54cf493211" containerName="neutron-api" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.101427 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a1d94f-dcff-4648-8b3e-0b54cf493211" containerName="neutron-api" Dec 11 05:33:04 crc kubenswrapper[4628]: E1211 05:33:04.101438 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36ef8b87-70d5-4803-a864-19cde1a04b87" containerName="dnsmasq-dns" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.101444 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="36ef8b87-70d5-4803-a864-19cde1a04b87" containerName="dnsmasq-dns" Dec 11 05:33:04 crc kubenswrapper[4628]: E1211 05:33:04.101453 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="941752a7-2f91-4e06-97a4-3f47417006f8" containerName="barbican-api" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.101459 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="941752a7-2f91-4e06-97a4-3f47417006f8" containerName="barbican-api" Dec 11 05:33:04 crc kubenswrapper[4628]: E1211 05:33:04.101474 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a1d94f-dcff-4648-8b3e-0b54cf493211" containerName="neutron-httpd" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.101479 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a1d94f-dcff-4648-8b3e-0b54cf493211" containerName="neutron-httpd" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.101641 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="941752a7-2f91-4e06-97a4-3f47417006f8" containerName="barbican-api" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.101654 4628 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="941752a7-2f91-4e06-97a4-3f47417006f8" containerName="barbican-api-log" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.101671 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a1d94f-dcff-4648-8b3e-0b54cf493211" containerName="neutron-httpd" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.101683 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="38627c48-4a86-4721-874d-8f386ea24495" containerName="cinder-db-sync" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.101694 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a1d94f-dcff-4648-8b3e-0b54cf493211" containerName="neutron-api" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.101703 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="36ef8b87-70d5-4803-a864-19cde1a04b87" containerName="dnsmasq-dns" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.102515 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.102533 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-bqcv5"] Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.103492 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-bqcv5"] Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.103566 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.104212 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.114794 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-57cjn" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.115221 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.115665 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.115937 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.186010 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-bqcv5\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.186057 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkfzj\" (UniqueName: \"kubernetes.io/projected/46f03c63-6732-4757-8685-0742a4c25590-kube-api-access-mkfzj\") pod \"dnsmasq-dns-5c9776ccc5-bqcv5\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.186083 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-config\") pod \"dnsmasq-dns-5c9776ccc5-bqcv5\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.186125 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-config-data\") pod \"cinder-scheduler-0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.186164 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82198259-8146-4ae9-b228-60a530d47fa0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.186179 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.186201 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq9z6\" (UniqueName: \"kubernetes.io/projected/82198259-8146-4ae9-b228-60a530d47fa0-kube-api-access-pq9z6\") pod \"cinder-scheduler-0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.186216 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-bqcv5\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.186232 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.186248 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-bqcv5\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.186262 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-scripts\") pod \"cinder-scheduler-0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.186286 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-bqcv5\" (UID: 
\"46f03c63-6732-4757-8685-0742a4c25590\") " pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.243571 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.245003 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.247309 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.258219 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.287923 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-config-data\") pod \"cinder-scheduler-0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.287976 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-scripts\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.288014 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7wvm\" (UniqueName: \"kubernetes.io/projected/e7f37951-8c87-4a67-93d4-c3f52faf992f-kube-api-access-t7wvm\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.288034 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82198259-8146-4ae9-b228-60a530d47fa0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.288052 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.288075 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq9z6\" (UniqueName: \"kubernetes.io/projected/82198259-8146-4ae9-b228-60a530d47fa0-kube-api-access-pq9z6\") pod \"cinder-scheduler-0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.288098 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-config-data\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.288116 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-bqcv5\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.288133 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.288154 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-bqcv5\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.288171 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-scripts\") pod \"cinder-scheduler-0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.288198 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-bqcv5\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.288234 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.288252 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7f37951-8c87-4a67-93d4-c3f52faf992f-logs\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.288267 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-config-data-custom\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.288292 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-bqcv5\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.288309 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkfzj\" (UniqueName: \"kubernetes.io/projected/46f03c63-6732-4757-8685-0742a4c25590-kube-api-access-mkfzj\") pod \"dnsmasq-dns-5c9776ccc5-bqcv5\" (UID: 
\"46f03c63-6732-4757-8685-0742a4c25590\") " pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.288332 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7f37951-8c87-4a67-93d4-c3f52faf992f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.288349 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-config\") pod \"dnsmasq-dns-5c9776ccc5-bqcv5\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.289166 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-config\") pod \"dnsmasq-dns-5c9776ccc5-bqcv5\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.289679 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-bqcv5\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.290340 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-bqcv5\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.290583 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82198259-8146-4ae9-b228-60a530d47fa0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.290711 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-bqcv5\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.292303 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-bqcv5\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.299667 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-scripts\") pod \"cinder-scheduler-0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.302435 4628 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-config-data\") pod \"cinder-scheduler-0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.302965 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.303811 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.311517 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkfzj\" (UniqueName: \"kubernetes.io/projected/46f03c63-6732-4757-8685-0742a4c25590-kube-api-access-mkfzj\") pod \"dnsmasq-dns-5c9776ccc5-bqcv5\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.311818 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq9z6\" (UniqueName: \"kubernetes.io/projected/82198259-8146-4ae9-b228-60a530d47fa0-kube-api-access-pq9z6\") pod \"cinder-scheduler-0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.322991 4628 generic.go:334] "Generic (PLEG): container finished" podID="941752a7-2f91-4e06-97a4-3f47417006f8" containerID="c31d2c821d8238f044f165c3224e831dd40e65fbfd4b786c97175686094a522e" exitCode=0 Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.323056 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75dfc9988b-75mkw" event={"ID":"941752a7-2f91-4e06-97a4-3f47417006f8","Type":"ContainerDied","Data":"c31d2c821d8238f044f165c3224e831dd40e65fbfd4b786c97175686094a522e"} Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.323080 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75dfc9988b-75mkw" event={"ID":"941752a7-2f91-4e06-97a4-3f47417006f8","Type":"ContainerDied","Data":"bd1ca41441ed2909f033a1fc363b5927e39487e612ebd1add956af235fa18f05"} Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.323098 4628 scope.go:117] "RemoveContainer" containerID="c31d2c821d8238f044f165c3224e831dd40e65fbfd4b786c97175686094a522e" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.323224 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-75dfc9988b-75mkw" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.337306 4628 generic.go:334] "Generic (PLEG): container finished" podID="48b1c132-b854-4494-9e51-d934e9946366" containerID="2cc0d2efb2af9d076af7ad3bce70cfe8d55582bda79e759a79461ebeb68e4130" exitCode=2 Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.337352 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48b1c132-b854-4494-9e51-d934e9946366","Type":"ContainerDied","Data":"2cc0d2efb2af9d076af7ad3bce70cfe8d55582bda79e759a79461ebeb68e4130"} Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.390019 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.390061 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7f37951-8c87-4a67-93d4-c3f52faf992f-logs\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.390100 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-config-data-custom\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.390171 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7f37951-8c87-4a67-93d4-c3f52faf992f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.390256 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-scripts\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.390305 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7wvm\" (UniqueName: \"kubernetes.io/projected/e7f37951-8c87-4a67-93d4-c3f52faf992f-kube-api-access-t7wvm\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.390356 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-config-data\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.390686 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7f37951-8c87-4a67-93d4-c3f52faf992f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.391039 4628 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7f37951-8c87-4a67-93d4-c3f52faf992f-logs\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.394982 4628 scope.go:117] "RemoveContainer" containerID="37d42a7cd50ef5b4985065bbcb8c179a8164804d0231545865b08692691418dd" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.399089 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-scripts\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.408889 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-75dfc9988b-75mkw"] Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.416503 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-75dfc9988b-75mkw"] Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.417407 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.418281 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-config-data\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.422765 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7wvm\" (UniqueName: \"kubernetes.io/projected/e7f37951-8c87-4a67-93d4-c3f52faf992f-kube-api-access-t7wvm\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.425506 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-config-data-custom\") pod \"cinder-api-0\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.452153 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.457017 4628 scope.go:117] "RemoveContainer" containerID="c31d2c821d8238f044f165c3224e831dd40e65fbfd4b786c97175686094a522e" Dec 11 05:33:04 crc kubenswrapper[4628]: E1211 05:33:04.461026 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c31d2c821d8238f044f165c3224e831dd40e65fbfd4b786c97175686094a522e\": container with ID starting with c31d2c821d8238f044f165c3224e831dd40e65fbfd4b786c97175686094a522e not found: ID does not exist" containerID="c31d2c821d8238f044f165c3224e831dd40e65fbfd4b786c97175686094a522e" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.461072 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c31d2c821d8238f044f165c3224e831dd40e65fbfd4b786c97175686094a522e"} err="failed to get container status \"c31d2c821d8238f044f165c3224e831dd40e65fbfd4b786c97175686094a522e\": rpc error: code = NotFound desc = could not find container \"c31d2c821d8238f044f165c3224e831dd40e65fbfd4b786c97175686094a522e\": container with ID starting with c31d2c821d8238f044f165c3224e831dd40e65fbfd4b786c97175686094a522e not found: ID does not exist" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.461097 4628 scope.go:117] "RemoveContainer" containerID="37d42a7cd50ef5b4985065bbcb8c179a8164804d0231545865b08692691418dd" Dec 11 05:33:04 crc kubenswrapper[4628]: E1211 05:33:04.465959 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37d42a7cd50ef5b4985065bbcb8c179a8164804d0231545865b08692691418dd\": container with ID starting with 37d42a7cd50ef5b4985065bbcb8c179a8164804d0231545865b08692691418dd not found: ID does not exist" containerID="37d42a7cd50ef5b4985065bbcb8c179a8164804d0231545865b08692691418dd" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.466005 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37d42a7cd50ef5b4985065bbcb8c179a8164804d0231545865b08692691418dd"} err="failed to get container status \"37d42a7cd50ef5b4985065bbcb8c179a8164804d0231545865b08692691418dd\": rpc error: code = NotFound desc = could not find container \"37d42a7cd50ef5b4985065bbcb8c179a8164804d0231545865b08692691418dd\": container with ID starting with 37d42a7cd50ef5b4985065bbcb8c179a8164804d0231545865b08692691418dd not found: ID does not exist" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.474805 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.587189 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 11 05:33:04 crc kubenswrapper[4628]: I1211 05:33:04.933084 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-bqcv5"] Dec 11 05:33:05 crc kubenswrapper[4628]: I1211 05:33:05.059281 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 11 05:33:05 crc kubenswrapper[4628]: I1211 05:33:05.097998 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 11 05:33:05 crc kubenswrapper[4628]: I1211 05:33:05.345187 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82198259-8146-4ae9-b228-60a530d47fa0","Type":"ContainerStarted","Data":"1133e0e0abfb182247c00c0a59c6451a7dd3051513023e09a292f0c26bd4a166"} Dec 11 05:33:05 crc kubenswrapper[4628]: I1211 05:33:05.347686 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e7f37951-8c87-4a67-93d4-c3f52faf992f","Type":"ContainerStarted","Data":"c4f880ce032bffe479319b53690f0cb75245fd94dedfb7236ae6f921993f8acd"} Dec 11 05:33:05 crc kubenswrapper[4628]: I1211 05:33:05.354577 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" event={"ID":"46f03c63-6732-4757-8685-0742a4c25590","Type":"ContainerStarted","Data":"25a17ebd1255ab860e6b40b219202f123e660645ac711bac9310f5fc4d069f32"} Dec 11 05:33:05 crc kubenswrapper[4628]: I1211 05:33:05.354609 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" event={"ID":"46f03c63-6732-4757-8685-0742a4c25590","Type":"ContainerStarted","Data":"3bfc7244561f6336396b09f287fb1a31fb76ec873a0d7c2b9647046761e0f002"} Dec 11 05:33:05 crc kubenswrapper[4628]: I1211 05:33:05.899333 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="941752a7-2f91-4e06-97a4-3f47417006f8" path="/var/lib/kubelet/pods/941752a7-2f91-4e06-97a4-3f47417006f8/volumes" Dec 11 05:33:06 crc kubenswrapper[4628]: I1211 05:33:06.073324 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 11 05:33:06 crc kubenswrapper[4628]: I1211 05:33:06.396752 4628 generic.go:334] "Generic (PLEG): container finished" podID="46f03c63-6732-4757-8685-0742a4c25590" containerID="25a17ebd1255ab860e6b40b219202f123e660645ac711bac9310f5fc4d069f32" exitCode=0 Dec 11 05:33:06 crc kubenswrapper[4628]: I1211 05:33:06.396896 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" event={"ID":"46f03c63-6732-4757-8685-0742a4c25590","Type":"ContainerDied","Data":"25a17ebd1255ab860e6b40b219202f123e660645ac711bac9310f5fc4d069f32"} Dec 11 05:33:06 crc kubenswrapper[4628]: I1211 05:33:06.406183 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e7f37951-8c87-4a67-93d4-c3f52faf992f","Type":"ContainerStarted","Data":"9dd0a873e1e326d878951e10ffdf240563d2bd11d8d76dd30d2b7d7617a82602"} Dec 11 05:33:07 crc kubenswrapper[4628]: I1211 05:33:07.422025 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" event={"ID":"46f03c63-6732-4757-8685-0742a4c25590","Type":"ContainerStarted","Data":"592581898fa37bdfd613617f3d99d3153ec464d171002c628d5d24835a4b33e5"} Dec 11 05:33:07 crc kubenswrapper[4628]: I1211 05:33:07.422599 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:07 crc kubenswrapper[4628]: 
I1211 05:33:07.425658 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82198259-8146-4ae9-b228-60a530d47fa0","Type":"ContainerStarted","Data":"ac03b7b89798c007406fec3b6345e3f49c9030e3cab1eda1d1939e586ec562e1"} Dec 11 05:33:07 crc kubenswrapper[4628]: I1211 05:33:07.428116 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e7f37951-8c87-4a67-93d4-c3f52faf992f","Type":"ContainerStarted","Data":"55aedc9f09b39eed41ed400038b5fd70610d8bf92971d2aa1fbe926509c13d6a"} Dec 11 05:33:07 crc kubenswrapper[4628]: I1211 05:33:07.428285 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e7f37951-8c87-4a67-93d4-c3f52faf992f" containerName="cinder-api-log" containerID="cri-o://9dd0a873e1e326d878951e10ffdf240563d2bd11d8d76dd30d2b7d7617a82602" gracePeriod=30 Dec 11 05:33:07 crc kubenswrapper[4628]: I1211 05:33:07.428468 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 11 05:33:07 crc kubenswrapper[4628]: I1211 05:33:07.428500 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e7f37951-8c87-4a67-93d4-c3f52faf992f" containerName="cinder-api" containerID="cri-o://55aedc9f09b39eed41ed400038b5fd70610d8bf92971d2aa1fbe926509c13d6a" gracePeriod=30 Dec 11 05:33:07 crc kubenswrapper[4628]: I1211 05:33:07.446391 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" podStartSLOduration=4.4463747510000005 podStartE2EDuration="4.446374751s" podCreationTimestamp="2025-12-11 05:33:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:33:07.443533864 +0000 UTC m=+1089.860880562" watchObservedRunningTime="2025-12-11 05:33:07.446374751 +0000 UTC m=+1089.863721449" Dec 11 05:33:07 crc kubenswrapper[4628]: I1211 05:33:07.478496 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.478474122 podStartE2EDuration="3.478474122s" podCreationTimestamp="2025-12-11 05:33:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:33:07.465839939 +0000 UTC m=+1089.883186637" watchObservedRunningTime="2025-12-11 05:33:07.478474122 +0000 UTC m=+1089.895820820" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.157078 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.221123 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48b1c132-b854-4494-9e51-d934e9946366-run-httpd\") pod \"48b1c132-b854-4494-9e51-d934e9946366\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.221189 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-sg-core-conf-yaml\") pod \"48b1c132-b854-4494-9e51-d934e9946366\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.221252 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-scripts\") pod \"48b1c132-b854-4494-9e51-d934e9946366\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.221271 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48b1c132-b854-4494-9e51-d934e9946366-log-httpd\") pod \"48b1c132-b854-4494-9e51-d934e9946366\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.221361 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-config-data\") pod \"48b1c132-b854-4494-9e51-d934e9946366\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.221458 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-combined-ca-bundle\") pod \"48b1c132-b854-4494-9e51-d934e9946366\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.221490 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njt7h\" (UniqueName: \"kubernetes.io/projected/48b1c132-b854-4494-9e51-d934e9946366-kube-api-access-njt7h\") pod \"48b1c132-b854-4494-9e51-d934e9946366\" (UID: \"48b1c132-b854-4494-9e51-d934e9946366\") " Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.222756 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48b1c132-b854-4494-9e51-d934e9946366-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "48b1c132-b854-4494-9e51-d934e9946366" (UID: "48b1c132-b854-4494-9e51-d934e9946366"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.224350 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48b1c132-b854-4494-9e51-d934e9946366-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "48b1c132-b854-4494-9e51-d934e9946366" (UID: "48b1c132-b854-4494-9e51-d934e9946366"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.250569 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48b1c132-b854-4494-9e51-d934e9946366-kube-api-access-njt7h" (OuterVolumeSpecName: "kube-api-access-njt7h") pod "48b1c132-b854-4494-9e51-d934e9946366" (UID: "48b1c132-b854-4494-9e51-d934e9946366"). InnerVolumeSpecName "kube-api-access-njt7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.259110 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-scripts" (OuterVolumeSpecName: "scripts") pod "48b1c132-b854-4494-9e51-d934e9946366" (UID: "48b1c132-b854-4494-9e51-d934e9946366"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.291334 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-config-data" (OuterVolumeSpecName: "config-data") pod "48b1c132-b854-4494-9e51-d934e9946366" (UID: "48b1c132-b854-4494-9e51-d934e9946366"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.309958 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "48b1c132-b854-4494-9e51-d934e9946366" (UID: "48b1c132-b854-4494-9e51-d934e9946366"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.320515 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48b1c132-b854-4494-9e51-d934e9946366" (UID: "48b1c132-b854-4494-9e51-d934e9946366"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.323901 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.323946 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.323957 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njt7h\" (UniqueName: \"kubernetes.io/projected/48b1c132-b854-4494-9e51-d934e9946366-kube-api-access-njt7h\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.323966 4628 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48b1c132-b854-4494-9e51-d934e9946366-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.323974 4628 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.323981 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48b1c132-b854-4494-9e51-d934e9946366-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.323989 4628 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/48b1c132-b854-4494-9e51-d934e9946366-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.339117 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.425583 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-config-data-custom\") pod \"e7f37951-8c87-4a67-93d4-c3f52faf992f\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.426332 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7f37951-8c87-4a67-93d4-c3f52faf992f-logs\") pod \"e7f37951-8c87-4a67-93d4-c3f52faf992f\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.426371 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-combined-ca-bundle\") pod \"e7f37951-8c87-4a67-93d4-c3f52faf992f\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.426510 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7wvm\" (UniqueName: \"kubernetes.io/projected/e7f37951-8c87-4a67-93d4-c3f52faf992f-kube-api-access-t7wvm\") pod \"e7f37951-8c87-4a67-93d4-c3f52faf992f\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.426540 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-config-data\") pod \"e7f37951-8c87-4a67-93d4-c3f52faf992f\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.426570 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-scripts\") pod \"e7f37951-8c87-4a67-93d4-c3f52faf992f\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.426589 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7f37951-8c87-4a67-93d4-c3f52faf992f-etc-machine-id\") pod \"e7f37951-8c87-4a67-93d4-c3f52faf992f\" (UID: \"e7f37951-8c87-4a67-93d4-c3f52faf992f\") " Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.427097 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7f37951-8c87-4a67-93d4-c3f52faf992f-logs" (OuterVolumeSpecName: "logs") pod "e7f37951-8c87-4a67-93d4-c3f52faf992f" (UID: "e7f37951-8c87-4a67-93d4-c3f52faf992f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.427375 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7f37951-8c87-4a67-93d4-c3f52faf992f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e7f37951-8c87-4a67-93d4-c3f52faf992f" (UID: "e7f37951-8c87-4a67-93d4-c3f52faf992f"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.427679 4628 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7f37951-8c87-4a67-93d4-c3f52faf992f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.427700 4628 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7f37951-8c87-4a67-93d4-c3f52faf992f-logs\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.428481 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e7f37951-8c87-4a67-93d4-c3f52faf992f" (UID: "e7f37951-8c87-4a67-93d4-c3f52faf992f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.433851 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7f37951-8c87-4a67-93d4-c3f52faf992f-kube-api-access-t7wvm" (OuterVolumeSpecName: "kube-api-access-t7wvm") pod "e7f37951-8c87-4a67-93d4-c3f52faf992f" (UID: "e7f37951-8c87-4a67-93d4-c3f52faf992f"). InnerVolumeSpecName "kube-api-access-t7wvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.439897 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-scripts" (OuterVolumeSpecName: "scripts") pod "e7f37951-8c87-4a67-93d4-c3f52faf992f" (UID: "e7f37951-8c87-4a67-93d4-c3f52faf992f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.441440 4628 generic.go:334] "Generic (PLEG): container finished" podID="e7f37951-8c87-4a67-93d4-c3f52faf992f" containerID="55aedc9f09b39eed41ed400038b5fd70610d8bf92971d2aa1fbe926509c13d6a" exitCode=0 Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.441471 4628 generic.go:334] "Generic (PLEG): container finished" podID="e7f37951-8c87-4a67-93d4-c3f52faf992f" containerID="9dd0a873e1e326d878951e10ffdf240563d2bd11d8d76dd30d2b7d7617a82602" exitCode=143 Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.441515 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e7f37951-8c87-4a67-93d4-c3f52faf992f","Type":"ContainerDied","Data":"55aedc9f09b39eed41ed400038b5fd70610d8bf92971d2aa1fbe926509c13d6a"} Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.441542 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e7f37951-8c87-4a67-93d4-c3f52faf992f","Type":"ContainerDied","Data":"9dd0a873e1e326d878951e10ffdf240563d2bd11d8d76dd30d2b7d7617a82602"} Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.441551 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e7f37951-8c87-4a67-93d4-c3f52faf992f","Type":"ContainerDied","Data":"c4f880ce032bffe479319b53690f0cb75245fd94dedfb7236ae6f921993f8acd"} Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.441568 4628 scope.go:117] "RemoveContainer" containerID="55aedc9f09b39eed41ed400038b5fd70610d8bf92971d2aa1fbe926509c13d6a" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.441681 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.447029 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82198259-8146-4ae9-b228-60a530d47fa0","Type":"ContainerStarted","Data":"8e74bd3d61e1f79b0d7d71950d796da760cb55489c28ecee120a80bd591a7f73"} Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.455024 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7f37951-8c87-4a67-93d4-c3f52faf992f" (UID: "e7f37951-8c87-4a67-93d4-c3f52faf992f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.455928 4628 generic.go:334] "Generic (PLEG): container finished" podID="48b1c132-b854-4494-9e51-d934e9946366" containerID="d97914ed64d5ccdeb58d74607b9c2b8cd80b527c374db55d801870f988ed65c2" exitCode=0 Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.456000 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.456076 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48b1c132-b854-4494-9e51-d934e9946366","Type":"ContainerDied","Data":"d97914ed64d5ccdeb58d74607b9c2b8cd80b527c374db55d801870f988ed65c2"} Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.456145 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"48b1c132-b854-4494-9e51-d934e9946366","Type":"ContainerDied","Data":"055b42a696d531bb08da19ec1ebb0c86feb43deabf6aca8a08b2fd8bfddc67c7"} Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.468070 4628 scope.go:117] "RemoveContainer" containerID="9dd0a873e1e326d878951e10ffdf240563d2bd11d8d76dd30d2b7d7617a82602" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.484328 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-config-data" (OuterVolumeSpecName: "config-data") pod "e7f37951-8c87-4a67-93d4-c3f52faf992f" (UID: "e7f37951-8c87-4a67-93d4-c3f52faf992f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.486327 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.069068774 podStartE2EDuration="5.486298484s" podCreationTimestamp="2025-12-11 05:33:03 +0000 UTC" firstStartedPulling="2025-12-11 05:33:05.087520587 +0000 UTC m=+1087.504867285" lastFinishedPulling="2025-12-11 05:33:06.504750297 +0000 UTC m=+1088.922096995" observedRunningTime="2025-12-11 05:33:08.467585505 +0000 UTC m=+1090.884932203" watchObservedRunningTime="2025-12-11 05:33:08.486298484 +0000 UTC m=+1090.903645182" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.521520 4628 scope.go:117] "RemoveContainer" containerID="55aedc9f09b39eed41ed400038b5fd70610d8bf92971d2aa1fbe926509c13d6a" Dec 11 05:33:08 crc kubenswrapper[4628]: E1211 05:33:08.521730 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55aedc9f09b39eed41ed400038b5fd70610d8bf92971d2aa1fbe926509c13d6a\": container with ID starting with 55aedc9f09b39eed41ed400038b5fd70610d8bf92971d2aa1fbe926509c13d6a not found: ID does not exist" containerID="55aedc9f09b39eed41ed400038b5fd70610d8bf92971d2aa1fbe926509c13d6a" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.521760 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55aedc9f09b39eed41ed400038b5fd70610d8bf92971d2aa1fbe926509c13d6a"} err="failed to get container status \"55aedc9f09b39eed41ed400038b5fd70610d8bf92971d2aa1fbe926509c13d6a\": rpc error: code = NotFound desc = could not find container \"55aedc9f09b39eed41ed400038b5fd70610d8bf92971d2aa1fbe926509c13d6a\": container with ID starting with 55aedc9f09b39eed41ed400038b5fd70610d8bf92971d2aa1fbe926509c13d6a not found: ID does not exist" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.521779 4628 scope.go:117] "RemoveContainer" containerID="9dd0a873e1e326d878951e10ffdf240563d2bd11d8d76dd30d2b7d7617a82602" Dec 11 05:33:08 crc kubenswrapper[4628]: E1211 05:33:08.522139 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dd0a873e1e326d878951e10ffdf240563d2bd11d8d76dd30d2b7d7617a82602\": container 
with ID starting with 9dd0a873e1e326d878951e10ffdf240563d2bd11d8d76dd30d2b7d7617a82602 not found: ID does not exist" containerID="9dd0a873e1e326d878951e10ffdf240563d2bd11d8d76dd30d2b7d7617a82602" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.522160 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd0a873e1e326d878951e10ffdf240563d2bd11d8d76dd30d2b7d7617a82602"} err="failed to get container status \"9dd0a873e1e326d878951e10ffdf240563d2bd11d8d76dd30d2b7d7617a82602\": rpc error: code = NotFound desc = could not find container \"9dd0a873e1e326d878951e10ffdf240563d2bd11d8d76dd30d2b7d7617a82602\": container with ID starting with 9dd0a873e1e326d878951e10ffdf240563d2bd11d8d76dd30d2b7d7617a82602 not found: ID does not exist" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.522172 4628 scope.go:117] "RemoveContainer" containerID="55aedc9f09b39eed41ed400038b5fd70610d8bf92971d2aa1fbe926509c13d6a" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.522336 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55aedc9f09b39eed41ed400038b5fd70610d8bf92971d2aa1fbe926509c13d6a"} err="failed to get container status \"55aedc9f09b39eed41ed400038b5fd70610d8bf92971d2aa1fbe926509c13d6a\": rpc error: code = NotFound desc = could not find container \"55aedc9f09b39eed41ed400038b5fd70610d8bf92971d2aa1fbe926509c13d6a\": container with ID starting with 55aedc9f09b39eed41ed400038b5fd70610d8bf92971d2aa1fbe926509c13d6a not found: ID does not exist" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.522350 4628 scope.go:117] "RemoveContainer" containerID="9dd0a873e1e326d878951e10ffdf240563d2bd11d8d76dd30d2b7d7617a82602" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.522508 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd0a873e1e326d878951e10ffdf240563d2bd11d8d76dd30d2b7d7617a82602"} err="failed to get container status \"9dd0a873e1e326d878951e10ffdf240563d2bd11d8d76dd30d2b7d7617a82602\": rpc error: code = NotFound desc = could not find container \"9dd0a873e1e326d878951e10ffdf240563d2bd11d8d76dd30d2b7d7617a82602\": container with ID starting with 9dd0a873e1e326d878951e10ffdf240563d2bd11d8d76dd30d2b7d7617a82602 not found: ID does not exist" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.522521 4628 scope.go:117] "RemoveContainer" containerID="2cc0d2efb2af9d076af7ad3bce70cfe8d55582bda79e759a79461ebeb68e4130" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.530813 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.531103 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.531234 4628 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.531302 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f37951-8c87-4a67-93d4-c3f52faf992f-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.531365 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7wvm\" (UniqueName: \"kubernetes.io/projected/e7f37951-8c87-4a67-93d4-c3f52faf992f-kube-api-access-t7wvm\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.543045 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.558023 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.558542 4628 scope.go:117] "RemoveContainer" containerID="d97914ed64d5ccdeb58d74607b9c2b8cd80b527c374db55d801870f988ed65c2" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.576793 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:08 crc kubenswrapper[4628]: E1211 05:33:08.577585 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f37951-8c87-4a67-93d4-c3f52faf992f" containerName="cinder-api-log" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.577974 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f37951-8c87-4a67-93d4-c3f52faf992f" containerName="cinder-api-log" Dec 11 05:33:08 crc kubenswrapper[4628]: E1211 05:33:08.578040 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b1c132-b854-4494-9e51-d934e9946366" containerName="ceilometer-notification-agent" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.578380 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b1c132-b854-4494-9e51-d934e9946366" containerName="ceilometer-notification-agent" Dec 11 05:33:08 crc kubenswrapper[4628]: E1211 05:33:08.578449 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b1c132-b854-4494-9e51-d934e9946366" containerName="sg-core" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.578523 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b1c132-b854-4494-9e51-d934e9946366" containerName="sg-core" Dec 11 05:33:08 crc kubenswrapper[4628]: E1211 05:33:08.578577 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f37951-8c87-4a67-93d4-c3f52faf992f" containerName="cinder-api" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.578622 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f37951-8c87-4a67-93d4-c3f52faf992f" containerName="cinder-api" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.578829 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f37951-8c87-4a67-93d4-c3f52faf992f" containerName="cinder-api" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.578911 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f37951-8c87-4a67-93d4-c3f52faf992f" containerName="cinder-api-log" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.578968 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b1c132-b854-4494-9e51-d934e9946366" containerName="sg-core" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.579027 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b1c132-b854-4494-9e51-d934e9946366" containerName="ceilometer-notification-agent" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.580650 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.582336 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.582869 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.592970 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.601535 4628 scope.go:117] "RemoveContainer" containerID="2cc0d2efb2af9d076af7ad3bce70cfe8d55582bda79e759a79461ebeb68e4130" Dec 11 05:33:08 crc kubenswrapper[4628]: E1211 05:33:08.605085 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cc0d2efb2af9d076af7ad3bce70cfe8d55582bda79e759a79461ebeb68e4130\": container with ID starting with 2cc0d2efb2af9d076af7ad3bce70cfe8d55582bda79e759a79461ebeb68e4130 not found: ID does not exist" containerID="2cc0d2efb2af9d076af7ad3bce70cfe8d55582bda79e759a79461ebeb68e4130" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.605129 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cc0d2efb2af9d076af7ad3bce70cfe8d55582bda79e759a79461ebeb68e4130"} err="failed to get container status \"2cc0d2efb2af9d076af7ad3bce70cfe8d55582bda79e759a79461ebeb68e4130\": rpc error: code = NotFound desc = could not find container \"2cc0d2efb2af9d076af7ad3bce70cfe8d55582bda79e759a79461ebeb68e4130\": container with ID starting with 2cc0d2efb2af9d076af7ad3bce70cfe8d55582bda79e759a79461ebeb68e4130 not found: ID does not exist" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.605159 4628 scope.go:117] "RemoveContainer" containerID="d97914ed64d5ccdeb58d74607b9c2b8cd80b527c374db55d801870f988ed65c2" Dec 11 05:33:08 crc kubenswrapper[4628]: E1211 05:33:08.605750 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d97914ed64d5ccdeb58d74607b9c2b8cd80b527c374db55d801870f988ed65c2\": container with ID starting with d97914ed64d5ccdeb58d74607b9c2b8cd80b527c374db55d801870f988ed65c2 not found: ID does not exist" containerID="d97914ed64d5ccdeb58d74607b9c2b8cd80b527c374db55d801870f988ed65c2" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.605784 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d97914ed64d5ccdeb58d74607b9c2b8cd80b527c374db55d801870f988ed65c2"} err="failed to get container status \"d97914ed64d5ccdeb58d74607b9c2b8cd80b527c374db55d801870f988ed65c2\": rpc error: code = NotFound desc = could not find container \"d97914ed64d5ccdeb58d74607b9c2b8cd80b527c374db55d801870f988ed65c2\": container with ID starting with d97914ed64d5ccdeb58d74607b9c2b8cd80b527c374db55d801870f988ed65c2 not found: ID does not exist" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.632429 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/872564c3-1be6-474b-907e-6527469f6e91-log-httpd\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.632646 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-scripts\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.632747 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-config-data\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.632864 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.632938 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/872564c3-1be6-474b-907e-6527469f6e91-run-httpd\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.633000 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.633076 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d6jp\" (UniqueName: \"kubernetes.io/projected/872564c3-1be6-474b-907e-6527469f6e91-kube-api-access-9d6jp\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.635517 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.707528 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.735019 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/872564c3-1be6-474b-907e-6527469f6e91-log-httpd\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.735095 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-scripts\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.735118 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-config-data\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.735203 4628 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.735229 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/872564c3-1be6-474b-907e-6527469f6e91-run-httpd\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.735245 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.735270 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d6jp\" (UniqueName: \"kubernetes.io/projected/872564c3-1be6-474b-907e-6527469f6e91-kube-api-access-9d6jp\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.735935 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/872564c3-1be6-474b-907e-6527469f6e91-log-httpd\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.740117 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/872564c3-1be6-474b-907e-6527469f6e91-run-httpd\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.740779 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-scripts\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.741474 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.744784 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-config-data\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.745829 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.759590 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9d6jp\" (UniqueName: \"kubernetes.io/projected/872564c3-1be6-474b-907e-6527469f6e91-kube-api-access-9d6jp\") pod \"ceilometer-0\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.781914 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.790185 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.801357 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.802984 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.806237 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.806435 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.809376 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.819103 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.899945 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.941637 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq5q8\" (UniqueName: \"kubernetes.io/projected/12485bc3-6a23-4772-8c00-2148c65fe10d-kube-api-access-jq5q8\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.941744 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12485bc3-6a23-4772-8c00-2148c65fe10d-config-data-custom\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.941769 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12485bc3-6a23-4772-8c00-2148c65fe10d-scripts\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.941783 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12485bc3-6a23-4772-8c00-2148c65fe10d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.941802 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12485bc3-6a23-4772-8c00-2148c65fe10d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 
05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.941884 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12485bc3-6a23-4772-8c00-2148c65fe10d-logs\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.941911 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12485bc3-6a23-4772-8c00-2148c65fe10d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.941941 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12485bc3-6a23-4772-8c00-2148c65fe10d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:08 crc kubenswrapper[4628]: I1211 05:33:08.941972 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12485bc3-6a23-4772-8c00-2148c65fe10d-config-data\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.043825 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12485bc3-6a23-4772-8c00-2148c65fe10d-config-data\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.044227 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq5q8\" (UniqueName: \"kubernetes.io/projected/12485bc3-6a23-4772-8c00-2148c65fe10d-kube-api-access-jq5q8\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.044347 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12485bc3-6a23-4772-8c00-2148c65fe10d-config-data-custom\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.044384 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12485bc3-6a23-4772-8c00-2148c65fe10d-scripts\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.044411 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12485bc3-6a23-4772-8c00-2148c65fe10d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.044443 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12485bc3-6a23-4772-8c00-2148c65fe10d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.044502 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12485bc3-6a23-4772-8c00-2148c65fe10d-logs\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.044532 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12485bc3-6a23-4772-8c00-2148c65fe10d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.044553 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12485bc3-6a23-4772-8c00-2148c65fe10d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.046444 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12485bc3-6a23-4772-8c00-2148c65fe10d-logs\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.046507 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12485bc3-6a23-4772-8c00-2148c65fe10d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.057618 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12485bc3-6a23-4772-8c00-2148c65fe10d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.062547 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12485bc3-6a23-4772-8c00-2148c65fe10d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.066310 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12485bc3-6a23-4772-8c00-2148c65fe10d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.069570 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12485bc3-6a23-4772-8c00-2148c65fe10d-config-data-custom\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.070636 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12485bc3-6a23-4772-8c00-2148c65fe10d-config-data\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" 
Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.071127 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12485bc3-6a23-4772-8c00-2148c65fe10d-scripts\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.087995 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq5q8\" (UniqueName: \"kubernetes.io/projected/12485bc3-6a23-4772-8c00-2148c65fe10d-kube-api-access-jq5q8\") pod \"cinder-api-0\" (UID: \"12485bc3-6a23-4772-8c00-2148c65fe10d\") " pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.116752 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.455173 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.475814 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.649648 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.900258 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48b1c132-b854-4494-9e51-d934e9946366" path="/var/lib/kubelet/pods/48b1c132-b854-4494-9e51-d934e9946366/volumes" Dec 11 05:33:09 crc kubenswrapper[4628]: I1211 05:33:09.901270 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7f37951-8c87-4a67-93d4-c3f52faf992f" path="/var/lib/kubelet/pods/e7f37951-8c87-4a67-93d4-c3f52faf992f/volumes" Dec 11 05:33:10 crc kubenswrapper[4628]: I1211 05:33:10.505581 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"872564c3-1be6-474b-907e-6527469f6e91","Type":"ContainerStarted","Data":"da54bb72b38f7fcc13d1e5bdf98a7d0686be9540c61b7c1bdb57246f63a052cb"} Dec 11 05:33:10 crc kubenswrapper[4628]: I1211 05:33:10.506240 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"872564c3-1be6-474b-907e-6527469f6e91","Type":"ContainerStarted","Data":"61e61f92a6b857821f851ba628074486257ede50ab9470770310ad9fd4ef9d32"} Dec 11 05:33:10 crc kubenswrapper[4628]: I1211 05:33:10.513373 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"12485bc3-6a23-4772-8c00-2148c65fe10d","Type":"ContainerStarted","Data":"2e5f183a3d67730aa18ae92c09d8c5ea5e74729b0a5334be2164af2bfb6af364"} Dec 11 05:33:10 crc kubenswrapper[4628]: I1211 05:33:10.513408 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"12485bc3-6a23-4772-8c00-2148c65fe10d","Type":"ContainerStarted","Data":"22df04c53f9004b6b526c644cac7fc48b7b6d32658a678b339bff842a858b90f"} Dec 11 05:33:10 crc kubenswrapper[4628]: I1211 05:33:10.596635 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:33:10 crc kubenswrapper[4628]: I1211 05:33:10.597743 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7989644c86-scmh4" Dec 11 05:33:10 crc kubenswrapper[4628]: I1211 05:33:10.686028 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-66bdd9d8cd-mgd96"] Dec 11 
05:33:11 crc kubenswrapper[4628]: I1211 05:33:11.522908 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"872564c3-1be6-474b-907e-6527469f6e91","Type":"ContainerStarted","Data":"fb604958834fbf7db528a4e60e454b36db0e55acdc4a06445a5e1e564d4ec0ac"} Dec 11 05:33:11 crc kubenswrapper[4628]: I1211 05:33:11.525034 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"12485bc3-6a23-4772-8c00-2148c65fe10d","Type":"ContainerStarted","Data":"fe23e90d773232bd97629fb9b4658000c17128cbbaa7c310cdf27f54b2ff70cf"} Dec 11 05:33:11 crc kubenswrapper[4628]: I1211 05:33:11.525151 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 11 05:33:11 crc kubenswrapper[4628]: I1211 05:33:11.525390 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-66bdd9d8cd-mgd96" podUID="8a3522a5-42e8-46ba-b794-d23582baa2a4" containerName="horizon-log" containerID="cri-o://b60d821722cdfc3fd82f6785ddf1b0a7349d9bd58013594052c7fa0d037fb3be" gracePeriod=30 Dec 11 05:33:11 crc kubenswrapper[4628]: I1211 05:33:11.525438 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-66bdd9d8cd-mgd96" podUID="8a3522a5-42e8-46ba-b794-d23582baa2a4" containerName="horizon" containerID="cri-o://a746c66aade8058642983deeedde27bfe41ebbcf4cc43d9cec8d1a2cd699c9e3" gracePeriod=30 Dec 11 05:33:13 crc kubenswrapper[4628]: I1211 05:33:13.548402 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"872564c3-1be6-474b-907e-6527469f6e91","Type":"ContainerStarted","Data":"7d659aa18233c021f160afa14611f38a271bc5e040b9dabc7146d79882b3cf37"} Dec 11 05:33:14 crc kubenswrapper[4628]: I1211 05:33:14.453977 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:33:14 crc kubenswrapper[4628]: I1211 05:33:14.482788 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.482763041 podStartE2EDuration="6.482763041s" podCreationTimestamp="2025-12-11 05:33:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:33:11.560125905 +0000 UTC m=+1093.977472603" watchObservedRunningTime="2025-12-11 05:33:14.482763041 +0000 UTC m=+1096.900109759" Dec 11 05:33:14 crc kubenswrapper[4628]: I1211 05:33:14.583106 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"872564c3-1be6-474b-907e-6527469f6e91","Type":"ContainerStarted","Data":"734c8007c302e5b056bf574f64f824b53a62f23ebe182eae5a2fe4a5146e4cc5"} Dec 11 05:33:14 crc kubenswrapper[4628]: I1211 05:33:14.584123 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 11 05:33:14 crc kubenswrapper[4628]: I1211 05:33:14.602612 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-t6ntr"] Dec 11 05:33:14 crc kubenswrapper[4628]: I1211 05:33:14.602942 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" podUID="8161e0eb-57ee-447e-9427-2e93432ff767" containerName="dnsmasq-dns" containerID="cri-o://c6b73587c64433dc7884e9c392d10563099e6a91905f0725da0dffee46264fef" gracePeriod=10 Dec 11 05:33:14 crc kubenswrapper[4628]: I1211 05:33:14.618024 4628 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.378274488 podStartE2EDuration="6.617996663s" podCreationTimestamp="2025-12-11 05:33:08 +0000 UTC" firstStartedPulling="2025-12-11 05:33:09.474071231 +0000 UTC m=+1091.891417929" lastFinishedPulling="2025-12-11 05:33:13.713793396 +0000 UTC m=+1096.131140104" observedRunningTime="2025-12-11 05:33:14.616364469 +0000 UTC m=+1097.033711167" watchObservedRunningTime="2025-12-11 05:33:14.617996663 +0000 UTC m=+1097.035343361" Dec 11 05:33:14 crc kubenswrapper[4628]: I1211 05:33:14.908769 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 11 05:33:14 crc kubenswrapper[4628]: I1211 05:33:14.963961 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.099326 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66bdd9d8cd-mgd96" podUID="8a3522a5-42e8-46ba-b794-d23582baa2a4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.205883 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.303787 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-ovsdbserver-sb\") pod \"8161e0eb-57ee-447e-9427-2e93432ff767\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.303883 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-ovsdbserver-nb\") pod \"8161e0eb-57ee-447e-9427-2e93432ff767\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.303955 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmlf2\" (UniqueName: \"kubernetes.io/projected/8161e0eb-57ee-447e-9427-2e93432ff767-kube-api-access-hmlf2\") pod \"8161e0eb-57ee-447e-9427-2e93432ff767\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.304069 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-dns-svc\") pod \"8161e0eb-57ee-447e-9427-2e93432ff767\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.304123 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-config\") pod \"8161e0eb-57ee-447e-9427-2e93432ff767\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.304159 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-dns-swift-storage-0\") pod \"8161e0eb-57ee-447e-9427-2e93432ff767\" (UID: 
\"8161e0eb-57ee-447e-9427-2e93432ff767\") " Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.368015 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8161e0eb-57ee-447e-9427-2e93432ff767-kube-api-access-hmlf2" (OuterVolumeSpecName: "kube-api-access-hmlf2") pod "8161e0eb-57ee-447e-9427-2e93432ff767" (UID: "8161e0eb-57ee-447e-9427-2e93432ff767"). InnerVolumeSpecName "kube-api-access-hmlf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.408320 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmlf2\" (UniqueName: \"kubernetes.io/projected/8161e0eb-57ee-447e-9427-2e93432ff767-kube-api-access-hmlf2\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.470825 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8161e0eb-57ee-447e-9427-2e93432ff767" (UID: "8161e0eb-57ee-447e-9427-2e93432ff767"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.471145 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8161e0eb-57ee-447e-9427-2e93432ff767" (UID: "8161e0eb-57ee-447e-9427-2e93432ff767"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.471404 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8161e0eb-57ee-447e-9427-2e93432ff767" (UID: "8161e0eb-57ee-447e-9427-2e93432ff767"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:33:15 crc kubenswrapper[4628]: E1211 05:33:15.509218 4628 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-config podName:8161e0eb-57ee-447e-9427-2e93432ff767 nodeName:}" failed. No retries permitted until 2025-12-11 05:33:16.009190767 +0000 UTC m=+1098.426537465 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-config") pod "8161e0eb-57ee-447e-9427-2e93432ff767" (UID: "8161e0eb-57ee-447e-9427-2e93432ff767") : error deleting /var/lib/kubelet/pods/8161e0eb-57ee-447e-9427-2e93432ff767/volume-subpaths: remove /var/lib/kubelet/pods/8161e0eb-57ee-447e-9427-2e93432ff767/volume-subpaths: no such file or directory Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.509536 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8161e0eb-57ee-447e-9427-2e93432ff767" (UID: "8161e0eb-57ee-447e-9427-2e93432ff767"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.510541 4628 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.510558 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.510567 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.510577 4628 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.593918 4628 generic.go:334] "Generic (PLEG): container finished" podID="8161e0eb-57ee-447e-9427-2e93432ff767" containerID="c6b73587c64433dc7884e9c392d10563099e6a91905f0725da0dffee46264fef" exitCode=0 Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.593982 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.594013 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" event={"ID":"8161e0eb-57ee-447e-9427-2e93432ff767","Type":"ContainerDied","Data":"c6b73587c64433dc7884e9c392d10563099e6a91905f0725da0dffee46264fef"} Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.594913 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-t6ntr" event={"ID":"8161e0eb-57ee-447e-9427-2e93432ff767","Type":"ContainerDied","Data":"7b5eb76b80be93d2b0cd7987b7e5348afdb6fb137d7883099633f28f73444119"} Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.594950 4628 scope.go:117] "RemoveContainer" containerID="c6b73587c64433dc7884e9c392d10563099e6a91905f0725da0dffee46264fef" Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.597397 4628 generic.go:334] "Generic (PLEG): container finished" podID="8a3522a5-42e8-46ba-b794-d23582baa2a4" containerID="a746c66aade8058642983deeedde27bfe41ebbcf4cc43d9cec8d1a2cd699c9e3" exitCode=0 Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.597513 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66bdd9d8cd-mgd96" event={"ID":"8a3522a5-42e8-46ba-b794-d23582baa2a4","Type":"ContainerDied","Data":"a746c66aade8058642983deeedde27bfe41ebbcf4cc43d9cec8d1a2cd699c9e3"} Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.597802 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="82198259-8146-4ae9-b228-60a530d47fa0" containerName="cinder-scheduler" containerID="cri-o://ac03b7b89798c007406fec3b6345e3f49c9030e3cab1eda1d1939e586ec562e1" gracePeriod=30 Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.597863 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="82198259-8146-4ae9-b228-60a530d47fa0" containerName="probe" 
containerID="cri-o://8e74bd3d61e1f79b0d7d71950d796da760cb55489c28ecee120a80bd591a7f73" gracePeriod=30 Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.619061 4628 scope.go:117] "RemoveContainer" containerID="961c0d0a8476731fc6fbfeb66baeb96663d5d22a13200ab6e0d719330848b431" Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.648095 4628 scope.go:117] "RemoveContainer" containerID="c6b73587c64433dc7884e9c392d10563099e6a91905f0725da0dffee46264fef" Dec 11 05:33:15 crc kubenswrapper[4628]: E1211 05:33:15.648499 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6b73587c64433dc7884e9c392d10563099e6a91905f0725da0dffee46264fef\": container with ID starting with c6b73587c64433dc7884e9c392d10563099e6a91905f0725da0dffee46264fef not found: ID does not exist" containerID="c6b73587c64433dc7884e9c392d10563099e6a91905f0725da0dffee46264fef" Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.648559 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b73587c64433dc7884e9c392d10563099e6a91905f0725da0dffee46264fef"} err="failed to get container status \"c6b73587c64433dc7884e9c392d10563099e6a91905f0725da0dffee46264fef\": rpc error: code = NotFound desc = could not find container \"c6b73587c64433dc7884e9c392d10563099e6a91905f0725da0dffee46264fef\": container with ID starting with c6b73587c64433dc7884e9c392d10563099e6a91905f0725da0dffee46264fef not found: ID does not exist" Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.648588 4628 scope.go:117] "RemoveContainer" containerID="961c0d0a8476731fc6fbfeb66baeb96663d5d22a13200ab6e0d719330848b431" Dec 11 05:33:15 crc kubenswrapper[4628]: E1211 05:33:15.649249 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"961c0d0a8476731fc6fbfeb66baeb96663d5d22a13200ab6e0d719330848b431\": container with ID starting with 961c0d0a8476731fc6fbfeb66baeb96663d5d22a13200ab6e0d719330848b431 not found: ID does not exist" containerID="961c0d0a8476731fc6fbfeb66baeb96663d5d22a13200ab6e0d719330848b431" Dec 11 05:33:15 crc kubenswrapper[4628]: I1211 05:33:15.649286 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"961c0d0a8476731fc6fbfeb66baeb96663d5d22a13200ab6e0d719330848b431"} err="failed to get container status \"961c0d0a8476731fc6fbfeb66baeb96663d5d22a13200ab6e0d719330848b431\": rpc error: code = NotFound desc = could not find container \"961c0d0a8476731fc6fbfeb66baeb96663d5d22a13200ab6e0d719330848b431\": container with ID starting with 961c0d0a8476731fc6fbfeb66baeb96663d5d22a13200ab6e0d719330848b431 not found: ID does not exist" Dec 11 05:33:16 crc kubenswrapper[4628]: I1211 05:33:16.020484 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-config\") pod \"8161e0eb-57ee-447e-9427-2e93432ff767\" (UID: \"8161e0eb-57ee-447e-9427-2e93432ff767\") " Dec 11 05:33:16 crc kubenswrapper[4628]: I1211 05:33:16.021194 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-config" (OuterVolumeSpecName: "config") pod "8161e0eb-57ee-447e-9427-2e93432ff767" (UID: "8161e0eb-57ee-447e-9427-2e93432ff767"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:33:16 crc kubenswrapper[4628]: I1211 05:33:16.123953 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8161e0eb-57ee-447e-9427-2e93432ff767-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:16 crc kubenswrapper[4628]: I1211 05:33:16.224181 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-t6ntr"] Dec 11 05:33:16 crc kubenswrapper[4628]: I1211 05:33:16.230348 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-t6ntr"] Dec 11 05:33:16 crc kubenswrapper[4628]: I1211 05:33:16.609460 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82198259-8146-4ae9-b228-60a530d47fa0","Type":"ContainerDied","Data":"8e74bd3d61e1f79b0d7d71950d796da760cb55489c28ecee120a80bd591a7f73"} Dec 11 05:33:16 crc kubenswrapper[4628]: I1211 05:33:16.609771 4628 generic.go:334] "Generic (PLEG): container finished" podID="82198259-8146-4ae9-b228-60a530d47fa0" containerID="8e74bd3d61e1f79b0d7d71950d796da760cb55489c28ecee120a80bd591a7f73" exitCode=0 Dec 11 05:33:16 crc kubenswrapper[4628]: I1211 05:33:16.980309 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-589776fd4-wpbmd" Dec 11 05:33:17 crc kubenswrapper[4628]: I1211 05:33:17.879419 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-55f69568c9-2p2zq" Dec 11 05:33:17 crc kubenswrapper[4628]: I1211 05:33:17.955605 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8161e0eb-57ee-447e-9427-2e93432ff767" path="/var/lib/kubelet/pods/8161e0eb-57ee-447e-9427-2e93432ff767/volumes" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.189231 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.292454 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-config-data-custom\") pod \"82198259-8146-4ae9-b228-60a530d47fa0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.292638 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-combined-ca-bundle\") pod \"82198259-8146-4ae9-b228-60a530d47fa0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.292679 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-scripts\") pod \"82198259-8146-4ae9-b228-60a530d47fa0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.292705 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82198259-8146-4ae9-b228-60a530d47fa0-etc-machine-id\") pod \"82198259-8146-4ae9-b228-60a530d47fa0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.292792 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-config-data\") pod \"82198259-8146-4ae9-b228-60a530d47fa0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.292819 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq9z6\" (UniqueName: \"kubernetes.io/projected/82198259-8146-4ae9-b228-60a530d47fa0-kube-api-access-pq9z6\") pod \"82198259-8146-4ae9-b228-60a530d47fa0\" (UID: \"82198259-8146-4ae9-b228-60a530d47fa0\") " Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.294291 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82198259-8146-4ae9-b228-60a530d47fa0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "82198259-8146-4ae9-b228-60a530d47fa0" (UID: "82198259-8146-4ae9-b228-60a530d47fa0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.300871 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-scripts" (OuterVolumeSpecName: "scripts") pod "82198259-8146-4ae9-b228-60a530d47fa0" (UID: "82198259-8146-4ae9-b228-60a530d47fa0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.319349 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "82198259-8146-4ae9-b228-60a530d47fa0" (UID: "82198259-8146-4ae9-b228-60a530d47fa0"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.320315 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82198259-8146-4ae9-b228-60a530d47fa0-kube-api-access-pq9z6" (OuterVolumeSpecName: "kube-api-access-pq9z6") pod "82198259-8146-4ae9-b228-60a530d47fa0" (UID: "82198259-8146-4ae9-b228-60a530d47fa0"). InnerVolumeSpecName "kube-api-access-pq9z6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.392408 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82198259-8146-4ae9-b228-60a530d47fa0" (UID: "82198259-8146-4ae9-b228-60a530d47fa0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.394415 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.394437 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.394448 4628 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82198259-8146-4ae9-b228-60a530d47fa0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.394457 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq9z6\" (UniqueName: \"kubernetes.io/projected/82198259-8146-4ae9-b228-60a530d47fa0-kube-api-access-pq9z6\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.394466 4628 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.444813 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-config-data" (OuterVolumeSpecName: "config-data") pod "82198259-8146-4ae9-b228-60a530d47fa0" (UID: "82198259-8146-4ae9-b228-60a530d47fa0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.496291 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82198259-8146-4ae9-b228-60a530d47fa0-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.641824 4628 generic.go:334] "Generic (PLEG): container finished" podID="82198259-8146-4ae9-b228-60a530d47fa0" containerID="ac03b7b89798c007406fec3b6345e3f49c9030e3cab1eda1d1939e586ec562e1" exitCode=0 Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.641888 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82198259-8146-4ae9-b228-60a530d47fa0","Type":"ContainerDied","Data":"ac03b7b89798c007406fec3b6345e3f49c9030e3cab1eda1d1939e586ec562e1"} Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.641919 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82198259-8146-4ae9-b228-60a530d47fa0","Type":"ContainerDied","Data":"1133e0e0abfb182247c00c0a59c6451a7dd3051513023e09a292f0c26bd4a166"} Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.641939 4628 scope.go:117] "RemoveContainer" containerID="8e74bd3d61e1f79b0d7d71950d796da760cb55489c28ecee120a80bd591a7f73" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.642080 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.671974 4628 scope.go:117] "RemoveContainer" containerID="ac03b7b89798c007406fec3b6345e3f49c9030e3cab1eda1d1939e586ec562e1" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.677898 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.688674 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.721132 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 11 05:33:19 crc kubenswrapper[4628]: E1211 05:33:19.721583 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8161e0eb-57ee-447e-9427-2e93432ff767" containerName="dnsmasq-dns" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.721608 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="8161e0eb-57ee-447e-9427-2e93432ff767" containerName="dnsmasq-dns" Dec 11 05:33:19 crc kubenswrapper[4628]: E1211 05:33:19.721646 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82198259-8146-4ae9-b228-60a530d47fa0" containerName="cinder-scheduler" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.721655 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="82198259-8146-4ae9-b228-60a530d47fa0" containerName="cinder-scheduler" Dec 11 05:33:19 crc kubenswrapper[4628]: E1211 05:33:19.721668 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8161e0eb-57ee-447e-9427-2e93432ff767" containerName="init" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.721675 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="8161e0eb-57ee-447e-9427-2e93432ff767" containerName="init" Dec 11 05:33:19 crc kubenswrapper[4628]: E1211 05:33:19.721686 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82198259-8146-4ae9-b228-60a530d47fa0" containerName="probe" Dec 11 05:33:19 crc 
kubenswrapper[4628]: I1211 05:33:19.721696 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="82198259-8146-4ae9-b228-60a530d47fa0" containerName="probe" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.721924 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="82198259-8146-4ae9-b228-60a530d47fa0" containerName="probe" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.721947 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="82198259-8146-4ae9-b228-60a530d47fa0" containerName="cinder-scheduler" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.721960 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="8161e0eb-57ee-447e-9427-2e93432ff767" containerName="dnsmasq-dns" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.723131 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.740316 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.743024 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.774065 4628 scope.go:117] "RemoveContainer" containerID="8e74bd3d61e1f79b0d7d71950d796da760cb55489c28ecee120a80bd591a7f73" Dec 11 05:33:19 crc kubenswrapper[4628]: E1211 05:33:19.774706 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e74bd3d61e1f79b0d7d71950d796da760cb55489c28ecee120a80bd591a7f73\": container with ID starting with 8e74bd3d61e1f79b0d7d71950d796da760cb55489c28ecee120a80bd591a7f73 not found: ID does not exist" containerID="8e74bd3d61e1f79b0d7d71950d796da760cb55489c28ecee120a80bd591a7f73" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.774735 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e74bd3d61e1f79b0d7d71950d796da760cb55489c28ecee120a80bd591a7f73"} err="failed to get container status \"8e74bd3d61e1f79b0d7d71950d796da760cb55489c28ecee120a80bd591a7f73\": rpc error: code = NotFound desc = could not find container \"8e74bd3d61e1f79b0d7d71950d796da760cb55489c28ecee120a80bd591a7f73\": container with ID starting with 8e74bd3d61e1f79b0d7d71950d796da760cb55489c28ecee120a80bd591a7f73 not found: ID does not exist" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.774758 4628 scope.go:117] "RemoveContainer" containerID="ac03b7b89798c007406fec3b6345e3f49c9030e3cab1eda1d1939e586ec562e1" Dec 11 05:33:19 crc kubenswrapper[4628]: E1211 05:33:19.775032 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac03b7b89798c007406fec3b6345e3f49c9030e3cab1eda1d1939e586ec562e1\": container with ID starting with ac03b7b89798c007406fec3b6345e3f49c9030e3cab1eda1d1939e586ec562e1 not found: ID does not exist" containerID="ac03b7b89798c007406fec3b6345e3f49c9030e3cab1eda1d1939e586ec562e1" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.775054 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac03b7b89798c007406fec3b6345e3f49c9030e3cab1eda1d1939e586ec562e1"} err="failed to get container status \"ac03b7b89798c007406fec3b6345e3f49c9030e3cab1eda1d1939e586ec562e1\": rpc error: code = NotFound desc = could not find container 
\"ac03b7b89798c007406fec3b6345e3f49c9030e3cab1eda1d1939e586ec562e1\": container with ID starting with ac03b7b89798c007406fec3b6345e3f49c9030e3cab1eda1d1939e586ec562e1 not found: ID does not exist" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.805989 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6f8836-27ef-4cbd-aed1-0949861716db-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"df6f8836-27ef-4cbd-aed1-0949861716db\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.806309 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df6f8836-27ef-4cbd-aed1-0949861716db-scripts\") pod \"cinder-scheduler-0\" (UID: \"df6f8836-27ef-4cbd-aed1-0949861716db\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.806460 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6f8836-27ef-4cbd-aed1-0949861716db-config-data\") pod \"cinder-scheduler-0\" (UID: \"df6f8836-27ef-4cbd-aed1-0949861716db\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.806708 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df6f8836-27ef-4cbd-aed1-0949861716db-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"df6f8836-27ef-4cbd-aed1-0949861716db\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.806986 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df6f8836-27ef-4cbd-aed1-0949861716db-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"df6f8836-27ef-4cbd-aed1-0949861716db\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.807091 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxhr5\" (UniqueName: \"kubernetes.io/projected/df6f8836-27ef-4cbd-aed1-0949861716db-kube-api-access-qxhr5\") pod \"cinder-scheduler-0\" (UID: \"df6f8836-27ef-4cbd-aed1-0949861716db\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.898010 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82198259-8146-4ae9-b228-60a530d47fa0" path="/var/lib/kubelet/pods/82198259-8146-4ae9-b228-60a530d47fa0/volumes" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.908336 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxhr5\" (UniqueName: \"kubernetes.io/projected/df6f8836-27ef-4cbd-aed1-0949861716db-kube-api-access-qxhr5\") pod \"cinder-scheduler-0\" (UID: \"df6f8836-27ef-4cbd-aed1-0949861716db\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.908629 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6f8836-27ef-4cbd-aed1-0949861716db-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"df6f8836-27ef-4cbd-aed1-0949861716db\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc 
kubenswrapper[4628]: I1211 05:33:19.908716 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df6f8836-27ef-4cbd-aed1-0949861716db-scripts\") pod \"cinder-scheduler-0\" (UID: \"df6f8836-27ef-4cbd-aed1-0949861716db\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.908816 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6f8836-27ef-4cbd-aed1-0949861716db-config-data\") pod \"cinder-scheduler-0\" (UID: \"df6f8836-27ef-4cbd-aed1-0949861716db\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.908930 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df6f8836-27ef-4cbd-aed1-0949861716db-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"df6f8836-27ef-4cbd-aed1-0949861716db\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.909091 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df6f8836-27ef-4cbd-aed1-0949861716db-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"df6f8836-27ef-4cbd-aed1-0949861716db\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.909178 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df6f8836-27ef-4cbd-aed1-0949861716db-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"df6f8836-27ef-4cbd-aed1-0949861716db\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.912934 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df6f8836-27ef-4cbd-aed1-0949861716db-scripts\") pod \"cinder-scheduler-0\" (UID: \"df6f8836-27ef-4cbd-aed1-0949861716db\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.913470 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df6f8836-27ef-4cbd-aed1-0949861716db-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"df6f8836-27ef-4cbd-aed1-0949861716db\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.921446 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6f8836-27ef-4cbd-aed1-0949861716db-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"df6f8836-27ef-4cbd-aed1-0949861716db\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.923051 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6f8836-27ef-4cbd-aed1-0949861716db-config-data\") pod \"cinder-scheduler-0\" (UID: \"df6f8836-27ef-4cbd-aed1-0949861716db\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:19 crc kubenswrapper[4628]: I1211 05:33:19.930035 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxhr5\" (UniqueName: \"kubernetes.io/projected/df6f8836-27ef-4cbd-aed1-0949861716db-kube-api-access-qxhr5\") pod \"cinder-scheduler-0\" (UID: 
\"df6f8836-27ef-4cbd-aed1-0949861716db\") " pod="openstack/cinder-scheduler-0" Dec 11 05:33:20 crc kubenswrapper[4628]: I1211 05:33:20.098585 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 11 05:33:20 crc kubenswrapper[4628]: I1211 05:33:20.614230 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 11 05:33:20 crc kubenswrapper[4628]: I1211 05:33:20.650523 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df6f8836-27ef-4cbd-aed1-0949861716db","Type":"ContainerStarted","Data":"c912e0f27420a0ec4b3a1900bb7c2ebd08b95bdf54da03aca7a20c276e549865"} Dec 11 05:33:21 crc kubenswrapper[4628]: I1211 05:33:21.664883 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df6f8836-27ef-4cbd-aed1-0949861716db","Type":"ContainerStarted","Data":"f2525228f4da0ce3075ac70f6b7a69b44bf0e88d73ac47c23084b9ddb17a0964"} Dec 11 05:33:21 crc kubenswrapper[4628]: I1211 05:33:21.738726 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 11 05:33:22 crc kubenswrapper[4628]: I1211 05:33:22.674132 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df6f8836-27ef-4cbd-aed1-0949861716db","Type":"ContainerStarted","Data":"eb0f9ac813d826f1b20fcaedab1fb55694241ee926df61d008e90d58e5e3a97a"} Dec 11 05:33:22 crc kubenswrapper[4628]: I1211 05:33:22.703874 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.703835227 podStartE2EDuration="3.703835227s" podCreationTimestamp="2025-12-11 05:33:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:33:22.695546222 +0000 UTC m=+1105.112892930" watchObservedRunningTime="2025-12-11 05:33:22.703835227 +0000 UTC m=+1105.121181925" Dec 11 05:33:22 crc kubenswrapper[4628]: I1211 05:33:22.834390 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 11 05:33:22 crc kubenswrapper[4628]: I1211 05:33:22.835533 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 11 05:33:22 crc kubenswrapper[4628]: I1211 05:33:22.837628 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 11 05:33:22 crc kubenswrapper[4628]: I1211 05:33:22.837630 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 11 05:33:22 crc kubenswrapper[4628]: I1211 05:33:22.837620 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-p7cz4" Dec 11 05:33:22 crc kubenswrapper[4628]: I1211 05:33:22.847239 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 11 05:33:22 crc kubenswrapper[4628]: I1211 05:33:22.998558 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75983d64-ba11-4ef7-a433-34863bd80b58-combined-ca-bundle\") pod \"openstackclient\" (UID: \"75983d64-ba11-4ef7-a433-34863bd80b58\") " pod="openstack/openstackclient" Dec 11 05:33:22 crc kubenswrapper[4628]: I1211 05:33:22.998614 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmp9j\" (UniqueName: \"kubernetes.io/projected/75983d64-ba11-4ef7-a433-34863bd80b58-kube-api-access-nmp9j\") pod \"openstackclient\" (UID: \"75983d64-ba11-4ef7-a433-34863bd80b58\") " pod="openstack/openstackclient" Dec 11 05:33:22 crc kubenswrapper[4628]: I1211 05:33:22.998773 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75983d64-ba11-4ef7-a433-34863bd80b58-openstack-config-secret\") pod \"openstackclient\" (UID: \"75983d64-ba11-4ef7-a433-34863bd80b58\") " pod="openstack/openstackclient" Dec 11 05:33:22 crc kubenswrapper[4628]: I1211 05:33:22.999066 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/75983d64-ba11-4ef7-a433-34863bd80b58-openstack-config\") pod \"openstackclient\" (UID: \"75983d64-ba11-4ef7-a433-34863bd80b58\") " pod="openstack/openstackclient" Dec 11 05:33:23 crc kubenswrapper[4628]: I1211 05:33:23.100221 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75983d64-ba11-4ef7-a433-34863bd80b58-combined-ca-bundle\") pod \"openstackclient\" (UID: \"75983d64-ba11-4ef7-a433-34863bd80b58\") " pod="openstack/openstackclient" Dec 11 05:33:23 crc kubenswrapper[4628]: I1211 05:33:23.100278 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmp9j\" (UniqueName: \"kubernetes.io/projected/75983d64-ba11-4ef7-a433-34863bd80b58-kube-api-access-nmp9j\") pod \"openstackclient\" (UID: \"75983d64-ba11-4ef7-a433-34863bd80b58\") " pod="openstack/openstackclient" Dec 11 05:33:23 crc kubenswrapper[4628]: I1211 05:33:23.100320 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75983d64-ba11-4ef7-a433-34863bd80b58-openstack-config-secret\") pod \"openstackclient\" (UID: \"75983d64-ba11-4ef7-a433-34863bd80b58\") " pod="openstack/openstackclient" Dec 11 05:33:23 crc kubenswrapper[4628]: I1211 05:33:23.100395 4628 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/75983d64-ba11-4ef7-a433-34863bd80b58-openstack-config\") pod \"openstackclient\" (UID: \"75983d64-ba11-4ef7-a433-34863bd80b58\") " pod="openstack/openstackclient" Dec 11 05:33:23 crc kubenswrapper[4628]: I1211 05:33:23.101162 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/75983d64-ba11-4ef7-a433-34863bd80b58-openstack-config\") pod \"openstackclient\" (UID: \"75983d64-ba11-4ef7-a433-34863bd80b58\") " pod="openstack/openstackclient" Dec 11 05:33:23 crc kubenswrapper[4628]: I1211 05:33:23.107974 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75983d64-ba11-4ef7-a433-34863bd80b58-combined-ca-bundle\") pod \"openstackclient\" (UID: \"75983d64-ba11-4ef7-a433-34863bd80b58\") " pod="openstack/openstackclient" Dec 11 05:33:23 crc kubenswrapper[4628]: I1211 05:33:23.108385 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75983d64-ba11-4ef7-a433-34863bd80b58-openstack-config-secret\") pod \"openstackclient\" (UID: \"75983d64-ba11-4ef7-a433-34863bd80b58\") " pod="openstack/openstackclient" Dec 11 05:33:23 crc kubenswrapper[4628]: I1211 05:33:23.127060 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmp9j\" (UniqueName: \"kubernetes.io/projected/75983d64-ba11-4ef7-a433-34863bd80b58-kube-api-access-nmp9j\") pod \"openstackclient\" (UID: \"75983d64-ba11-4ef7-a433-34863bd80b58\") " pod="openstack/openstackclient" Dec 11 05:33:23 crc kubenswrapper[4628]: I1211 05:33:23.153212 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 11 05:33:23 crc kubenswrapper[4628]: I1211 05:33:23.693566 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 11 05:33:24 crc kubenswrapper[4628]: I1211 05:33:24.691790 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"75983d64-ba11-4ef7-a433-34863bd80b58","Type":"ContainerStarted","Data":"fcfe81982e668b741f18ca65ae60d41674957a3bd477dfbe6043358e0011171b"} Dec 11 05:33:24 crc kubenswrapper[4628]: I1211 05:33:24.756883 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7588f48d9f-5vkfm"] Dec 11 05:33:24 crc kubenswrapper[4628]: I1211 05:33:24.758319 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:24 crc kubenswrapper[4628]: I1211 05:33:24.760788 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 11 05:33:24 crc kubenswrapper[4628]: I1211 05:33:24.760878 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 11 05:33:24 crc kubenswrapper[4628]: I1211 05:33:24.760981 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 11 05:33:24 crc kubenswrapper[4628]: I1211 05:33:24.820653 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7588f48d9f-5vkfm"] Dec 11 05:33:24 crc kubenswrapper[4628]: I1211 05:33:24.932203 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87045d6-b3bc-468e-8121-1023f3f30de0-internal-tls-certs\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:24 crc kubenswrapper[4628]: I1211 05:33:24.932520 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b87045d6-b3bc-468e-8121-1023f3f30de0-log-httpd\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:24 crc kubenswrapper[4628]: I1211 05:33:24.932548 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87045d6-b3bc-468e-8121-1023f3f30de0-public-tls-certs\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:24 crc kubenswrapper[4628]: I1211 05:33:24.932567 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b87045d6-b3bc-468e-8121-1023f3f30de0-run-httpd\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:24 crc kubenswrapper[4628]: I1211 05:33:24.932599 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b87045d6-b3bc-468e-8121-1023f3f30de0-etc-swift\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:24 crc kubenswrapper[4628]: I1211 05:33:24.932652 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6vjk\" (UniqueName: \"kubernetes.io/projected/b87045d6-b3bc-468e-8121-1023f3f30de0-kube-api-access-s6vjk\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:24 crc kubenswrapper[4628]: I1211 05:33:24.932675 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87045d6-b3bc-468e-8121-1023f3f30de0-config-data\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " 
pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:24 crc kubenswrapper[4628]: I1211 05:33:24.932729 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87045d6-b3bc-468e-8121-1023f3f30de0-combined-ca-bundle\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.035304 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b87045d6-b3bc-468e-8121-1023f3f30de0-run-httpd\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.035397 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b87045d6-b3bc-468e-8121-1023f3f30de0-etc-swift\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.035474 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6vjk\" (UniqueName: \"kubernetes.io/projected/b87045d6-b3bc-468e-8121-1023f3f30de0-kube-api-access-s6vjk\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.035538 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87045d6-b3bc-468e-8121-1023f3f30de0-config-data\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.035643 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87045d6-b3bc-468e-8121-1023f3f30de0-combined-ca-bundle\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.035749 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87045d6-b3bc-468e-8121-1023f3f30de0-internal-tls-certs\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.035824 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b87045d6-b3bc-468e-8121-1023f3f30de0-log-httpd\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.036017 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87045d6-b3bc-468e-8121-1023f3f30de0-public-tls-certs\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " 
pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.037866 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b87045d6-b3bc-468e-8121-1023f3f30de0-run-httpd\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.041730 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b87045d6-b3bc-468e-8121-1023f3f30de0-log-httpd\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.057545 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87045d6-b3bc-468e-8121-1023f3f30de0-internal-tls-certs\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.057548 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87045d6-b3bc-468e-8121-1023f3f30de0-combined-ca-bundle\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.059017 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87045d6-b3bc-468e-8121-1023f3f30de0-config-data\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.059564 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6vjk\" (UniqueName: \"kubernetes.io/projected/b87045d6-b3bc-468e-8121-1023f3f30de0-kube-api-access-s6vjk\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.088509 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87045d6-b3bc-468e-8121-1023f3f30de0-public-tls-certs\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.088622 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b87045d6-b3bc-468e-8121-1023f3f30de0-etc-swift\") pod \"swift-proxy-7588f48d9f-5vkfm\" (UID: \"b87045d6-b3bc-468e-8121-1023f3f30de0\") " pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.104722 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.106168 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66bdd9d8cd-mgd96" podUID="8a3522a5-42e8-46ba-b794-d23582baa2a4" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.379201 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:25 crc kubenswrapper[4628]: I1211 05:33:25.932933 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7588f48d9f-5vkfm"] Dec 11 05:33:26 crc kubenswrapper[4628]: I1211 05:33:26.715283 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7588f48d9f-5vkfm" event={"ID":"b87045d6-b3bc-468e-8121-1023f3f30de0","Type":"ContainerStarted","Data":"49a69eecd85b8dcaa85a2a4ee2ef1abf7c142ca229a66e7689ec283d5d5fedea"} Dec 11 05:33:26 crc kubenswrapper[4628]: I1211 05:33:26.715704 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7588f48d9f-5vkfm" event={"ID":"b87045d6-b3bc-468e-8121-1023f3f30de0","Type":"ContainerStarted","Data":"7dfe82d6f81cc0d9df3fc0dd33c3b670947504f4e607a4a978774f7e7d14eec6"} Dec 11 05:33:27 crc kubenswrapper[4628]: I1211 05:33:27.718483 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:27 crc kubenswrapper[4628]: I1211 05:33:27.719710 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="872564c3-1be6-474b-907e-6527469f6e91" containerName="ceilometer-central-agent" containerID="cri-o://da54bb72b38f7fcc13d1e5bdf98a7d0686be9540c61b7c1bdb57246f63a052cb" gracePeriod=30 Dec 11 05:33:27 crc kubenswrapper[4628]: I1211 05:33:27.719902 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="872564c3-1be6-474b-907e-6527469f6e91" containerName="ceilometer-notification-agent" containerID="cri-o://fb604958834fbf7db528a4e60e454b36db0e55acdc4a06445a5e1e564d4ec0ac" gracePeriod=30 Dec 11 05:33:27 crc kubenswrapper[4628]: I1211 05:33:27.719773 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="872564c3-1be6-474b-907e-6527469f6e91" containerName="sg-core" containerID="cri-o://7d659aa18233c021f160afa14611f38a271bc5e040b9dabc7146d79882b3cf37" gracePeriod=30 Dec 11 05:33:27 crc kubenswrapper[4628]: I1211 05:33:27.719742 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="872564c3-1be6-474b-907e-6527469f6e91" containerName="proxy-httpd" containerID="cri-o://734c8007c302e5b056bf574f64f824b53a62f23ebe182eae5a2fe4a5146e4cc5" gracePeriod=30 Dec 11 05:33:27 crc kubenswrapper[4628]: I1211 05:33:27.727742 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7588f48d9f-5vkfm" event={"ID":"b87045d6-b3bc-468e-8121-1023f3f30de0","Type":"ContainerStarted","Data":"f229684cb52ac15bc660695abf75d0f9af122668e185998cd5a37af3cebeca21"} Dec 11 05:33:27 crc kubenswrapper[4628]: I1211 05:33:27.728892 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:27 crc kubenswrapper[4628]: I1211 05:33:27.728922 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:27 crc kubenswrapper[4628]: I1211 05:33:27.729435 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 11 05:33:27 crc kubenswrapper[4628]: I1211 05:33:27.755708 4628 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7588f48d9f-5vkfm" podStartSLOduration=3.755646081 podStartE2EDuration="3.755646081s" podCreationTimestamp="2025-12-11 05:33:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:33:27.753305078 +0000 UTC m=+1110.170651776" watchObservedRunningTime="2025-12-11 05:33:27.755646081 +0000 UTC m=+1110.172992779" Dec 11 05:33:28 crc kubenswrapper[4628]: I1211 05:33:28.738159 4628 generic.go:334] "Generic (PLEG): container finished" podID="872564c3-1be6-474b-907e-6527469f6e91" containerID="734c8007c302e5b056bf574f64f824b53a62f23ebe182eae5a2fe4a5146e4cc5" exitCode=0 Dec 11 05:33:28 crc kubenswrapper[4628]: I1211 05:33:28.738511 4628 generic.go:334] "Generic (PLEG): container finished" podID="872564c3-1be6-474b-907e-6527469f6e91" containerID="7d659aa18233c021f160afa14611f38a271bc5e040b9dabc7146d79882b3cf37" exitCode=2 Dec 11 05:33:28 crc kubenswrapper[4628]: I1211 05:33:28.738520 4628 generic.go:334] "Generic (PLEG): container finished" podID="872564c3-1be6-474b-907e-6527469f6e91" containerID="da54bb72b38f7fcc13d1e5bdf98a7d0686be9540c61b7c1bdb57246f63a052cb" exitCode=0 Dec 11 05:33:28 crc kubenswrapper[4628]: I1211 05:33:28.738478 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"872564c3-1be6-474b-907e-6527469f6e91","Type":"ContainerDied","Data":"734c8007c302e5b056bf574f64f824b53a62f23ebe182eae5a2fe4a5146e4cc5"} Dec 11 05:33:28 crc kubenswrapper[4628]: I1211 05:33:28.738631 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"872564c3-1be6-474b-907e-6527469f6e91","Type":"ContainerDied","Data":"7d659aa18233c021f160afa14611f38a271bc5e040b9dabc7146d79882b3cf37"} Dec 11 05:33:28 crc kubenswrapper[4628]: I1211 05:33:28.738641 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"872564c3-1be6-474b-907e-6527469f6e91","Type":"ContainerDied","Data":"da54bb72b38f7fcc13d1e5bdf98a7d0686be9540c61b7c1bdb57246f63a052cb"} Dec 11 05:33:30 crc kubenswrapper[4628]: I1211 05:33:30.331759 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 11 05:33:30 crc kubenswrapper[4628]: I1211 05:33:30.760953 4628 generic.go:334] "Generic (PLEG): container finished" podID="872564c3-1be6-474b-907e-6527469f6e91" containerID="fb604958834fbf7db528a4e60e454b36db0e55acdc4a06445a5e1e564d4ec0ac" exitCode=0 Dec 11 05:33:30 crc kubenswrapper[4628]: I1211 05:33:30.761012 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"872564c3-1be6-474b-907e-6527469f6e91","Type":"ContainerDied","Data":"fb604958834fbf7db528a4e60e454b36db0e55acdc4a06445a5e1e564d4ec0ac"} Dec 11 05:33:31 crc kubenswrapper[4628]: I1211 05:33:31.427663 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:33:31 crc kubenswrapper[4628]: I1211 05:33:31.427718 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:33:35 crc kubenswrapper[4628]: I1211 05:33:35.093498 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66bdd9d8cd-mgd96" podUID="8a3522a5-42e8-46ba-b794-d23582baa2a4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Dec 11 05:33:35 crc kubenswrapper[4628]: I1211 05:33:35.094229 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:33:35 crc kubenswrapper[4628]: I1211 05:33:35.437871 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:35 crc kubenswrapper[4628]: I1211 05:33:35.447397 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7588f48d9f-5vkfm" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.290488 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.375039 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-combined-ca-bundle\") pod \"872564c3-1be6-474b-907e-6527469f6e91\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.375257 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-scripts\") pod \"872564c3-1be6-474b-907e-6527469f6e91\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.375297 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/872564c3-1be6-474b-907e-6527469f6e91-log-httpd\") pod \"872564c3-1be6-474b-907e-6527469f6e91\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.375901 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/872564c3-1be6-474b-907e-6527469f6e91-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "872564c3-1be6-474b-907e-6527469f6e91" (UID: "872564c3-1be6-474b-907e-6527469f6e91"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.376407 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-sg-core-conf-yaml\") pod \"872564c3-1be6-474b-907e-6527469f6e91\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.376454 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/872564c3-1be6-474b-907e-6527469f6e91-run-httpd\") pod \"872564c3-1be6-474b-907e-6527469f6e91\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.376481 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-config-data\") pod \"872564c3-1be6-474b-907e-6527469f6e91\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.376586 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d6jp\" (UniqueName: \"kubernetes.io/projected/872564c3-1be6-474b-907e-6527469f6e91-kube-api-access-9d6jp\") pod \"872564c3-1be6-474b-907e-6527469f6e91\" (UID: \"872564c3-1be6-474b-907e-6527469f6e91\") " Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.376943 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/872564c3-1be6-474b-907e-6527469f6e91-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "872564c3-1be6-474b-907e-6527469f6e91" (UID: "872564c3-1be6-474b-907e-6527469f6e91"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.377186 4628 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/872564c3-1be6-474b-907e-6527469f6e91-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.377199 4628 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/872564c3-1be6-474b-907e-6527469f6e91-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.380472 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-scripts" (OuterVolumeSpecName: "scripts") pod "872564c3-1be6-474b-907e-6527469f6e91" (UID: "872564c3-1be6-474b-907e-6527469f6e91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.387248 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/872564c3-1be6-474b-907e-6527469f6e91-kube-api-access-9d6jp" (OuterVolumeSpecName: "kube-api-access-9d6jp") pod "872564c3-1be6-474b-907e-6527469f6e91" (UID: "872564c3-1be6-474b-907e-6527469f6e91"). InnerVolumeSpecName "kube-api-access-9d6jp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.415440 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "872564c3-1be6-474b-907e-6527469f6e91" (UID: "872564c3-1be6-474b-907e-6527469f6e91"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.464030 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "872564c3-1be6-474b-907e-6527469f6e91" (UID: "872564c3-1be6-474b-907e-6527469f6e91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.479143 4628 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.479176 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d6jp\" (UniqueName: \"kubernetes.io/projected/872564c3-1be6-474b-907e-6527469f6e91-kube-api-access-9d6jp\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.479187 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.479196 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.495036 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-config-data" (OuterVolumeSpecName: "config-data") pod "872564c3-1be6-474b-907e-6527469f6e91" (UID: "872564c3-1be6-474b-907e-6527469f6e91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.580528 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872564c3-1be6-474b-907e-6527469f6e91-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.812945 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"872564c3-1be6-474b-907e-6527469f6e91","Type":"ContainerDied","Data":"61e61f92a6b857821f851ba628074486257ede50ab9470770310ad9fd4ef9d32"} Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.813039 4628 scope.go:117] "RemoveContainer" containerID="734c8007c302e5b056bf574f64f824b53a62f23ebe182eae5a2fe4a5146e4cc5" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.813268 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.814378 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"75983d64-ba11-4ef7-a433-34863bd80b58","Type":"ContainerStarted","Data":"8f9a93af15f6cd77ebd08b8f8bc907610631246f427d8f8db02195e664d7b4df"} Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.843663 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.511058482 podStartE2EDuration="14.843647177s" podCreationTimestamp="2025-12-11 05:33:22 +0000 UTC" firstStartedPulling="2025-12-11 05:33:23.716973452 +0000 UTC m=+1106.134320150" lastFinishedPulling="2025-12-11 05:33:36.049562147 +0000 UTC m=+1118.466908845" observedRunningTime="2025-12-11 05:33:36.835362441 +0000 UTC m=+1119.252709139" watchObservedRunningTime="2025-12-11 05:33:36.843647177 +0000 UTC m=+1119.260993875" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.845736 4628 scope.go:117] "RemoveContainer" containerID="7d659aa18233c021f160afa14611f38a271bc5e040b9dabc7146d79882b3cf37" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.856427 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.864876 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.900986 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:36 crc kubenswrapper[4628]: E1211 05:33:36.901905 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="872564c3-1be6-474b-907e-6527469f6e91" containerName="ceilometer-central-agent" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.901921 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="872564c3-1be6-474b-907e-6527469f6e91" containerName="ceilometer-central-agent" Dec 11 05:33:36 crc kubenswrapper[4628]: E1211 05:33:36.901940 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="872564c3-1be6-474b-907e-6527469f6e91" containerName="sg-core" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.901946 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="872564c3-1be6-474b-907e-6527469f6e91" containerName="sg-core" Dec 11 05:33:36 crc kubenswrapper[4628]: E1211 05:33:36.901956 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="872564c3-1be6-474b-907e-6527469f6e91" containerName="proxy-httpd" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.901964 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="872564c3-1be6-474b-907e-6527469f6e91" containerName="proxy-httpd" Dec 11 05:33:36 crc kubenswrapper[4628]: E1211 05:33:36.901977 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="872564c3-1be6-474b-907e-6527469f6e91" containerName="ceilometer-notification-agent" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.901982 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="872564c3-1be6-474b-907e-6527469f6e91" containerName="ceilometer-notification-agent" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.902160 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="872564c3-1be6-474b-907e-6527469f6e91" containerName="ceilometer-central-agent" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.902178 4628 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="872564c3-1be6-474b-907e-6527469f6e91" containerName="ceilometer-notification-agent" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.902188 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="872564c3-1be6-474b-907e-6527469f6e91" containerName="proxy-httpd" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.902202 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="872564c3-1be6-474b-907e-6527469f6e91" containerName="sg-core" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.914014 4628 scope.go:117] "RemoveContainer" containerID="fb604958834fbf7db528a4e60e454b36db0e55acdc4a06445a5e1e564d4ec0ac" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.944655 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.950701 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.950986 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 11 05:33:36 crc kubenswrapper[4628]: I1211 05:33:36.967253 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.019165 4628 scope.go:117] "RemoveContainer" containerID="da54bb72b38f7fcc13d1e5bdf98a7d0686be9540c61b7c1bdb57246f63a052cb" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.095763 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-scripts\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.096095 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.096290 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7868h\" (UniqueName: \"kubernetes.io/projected/4c2a1809-7476-46e1-a61e-582dfeb95093-kube-api-access-7868h\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.096386 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c2a1809-7476-46e1-a61e-582dfeb95093-run-httpd\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.096542 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c2a1809-7476-46e1-a61e-582dfeb95093-log-httpd\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.096643 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-config-data\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.096721 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.097330 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:37 crc kubenswrapper[4628]: E1211 05:33:37.098061 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-7868h log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="4c2a1809-7476-46e1-a61e-582dfeb95093" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.127194 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.127674 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8" containerName="kube-state-metrics" containerID="cri-o://56e852587b89b3c4648d688fad2ba0c65c8509bc4fc5d7bf4b074171112fb921" gracePeriod=30 Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.198014 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7868h\" (UniqueName: \"kubernetes.io/projected/4c2a1809-7476-46e1-a61e-582dfeb95093-kube-api-access-7868h\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.198299 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c2a1809-7476-46e1-a61e-582dfeb95093-run-httpd\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.198412 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c2a1809-7476-46e1-a61e-582dfeb95093-log-httpd\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.198485 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-config-data\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.198550 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.198630 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-scripts\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.198725 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.198830 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c2a1809-7476-46e1-a61e-582dfeb95093-run-httpd\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.198911 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c2a1809-7476-46e1-a61e-582dfeb95093-log-httpd\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.203767 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-config-data\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.204139 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-scripts\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.205713 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.216512 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7868h\" (UniqueName: \"kubernetes.io/projected/4c2a1809-7476-46e1-a61e-582dfeb95093-kube-api-access-7868h\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.216674 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.609735 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.707430 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s828\" (UniqueName: \"kubernetes.io/projected/ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8-kube-api-access-5s828\") pod \"ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8\" (UID: \"ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8\") " Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.711925 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8-kube-api-access-5s828" (OuterVolumeSpecName: "kube-api-access-5s828") pod "ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8" (UID: "ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8"). InnerVolumeSpecName "kube-api-access-5s828". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.712337 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s828\" (UniqueName: \"kubernetes.io/projected/ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8-kube-api-access-5s828\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.824362 4628 generic.go:334] "Generic (PLEG): container finished" podID="ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8" containerID="56e852587b89b3c4648d688fad2ba0c65c8509bc4fc5d7bf4b074171112fb921" exitCode=2 Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.824427 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8","Type":"ContainerDied","Data":"56e852587b89b3c4648d688fad2ba0c65c8509bc4fc5d7bf4b074171112fb921"} Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.824495 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8","Type":"ContainerDied","Data":"9dfd751e3ee0674ed5c12bff59eea513dd6a63e500db97130ea4da7e1dd3982c"} Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.824516 4628 scope.go:117] "RemoveContainer" containerID="56e852587b89b3c4648d688fad2ba0c65c8509bc4fc5d7bf4b074171112fb921" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.824455 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.824967 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.833385 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.845874 4628 scope.go:117] "RemoveContainer" containerID="56e852587b89b3c4648d688fad2ba0c65c8509bc4fc5d7bf4b074171112fb921" Dec 11 05:33:37 crc kubenswrapper[4628]: E1211 05:33:37.846315 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56e852587b89b3c4648d688fad2ba0c65c8509bc4fc5d7bf4b074171112fb921\": container with ID starting with 56e852587b89b3c4648d688fad2ba0c65c8509bc4fc5d7bf4b074171112fb921 not found: ID does not exist" containerID="56e852587b89b3c4648d688fad2ba0c65c8509bc4fc5d7bf4b074171112fb921" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.846419 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56e852587b89b3c4648d688fad2ba0c65c8509bc4fc5d7bf4b074171112fb921"} err="failed to get container status \"56e852587b89b3c4648d688fad2ba0c65c8509bc4fc5d7bf4b074171112fb921\": rpc error: code = NotFound desc = could not find container \"56e852587b89b3c4648d688fad2ba0c65c8509bc4fc5d7bf4b074171112fb921\": container with ID starting with 56e852587b89b3c4648d688fad2ba0c65c8509bc4fc5d7bf4b074171112fb921 not found: ID does not exist" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.866061 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.914269 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="872564c3-1be6-474b-907e-6527469f6e91" path="/var/lib/kubelet/pods/872564c3-1be6-474b-907e-6527469f6e91/volumes" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.916143 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-scripts\") pod \"4c2a1809-7476-46e1-a61e-582dfeb95093\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.916253 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c2a1809-7476-46e1-a61e-582dfeb95093-run-httpd\") pod \"4c2a1809-7476-46e1-a61e-582dfeb95093\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.916319 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c2a1809-7476-46e1-a61e-582dfeb95093-log-httpd\") pod \"4c2a1809-7476-46e1-a61e-582dfeb95093\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.916398 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7868h\" (UniqueName: \"kubernetes.io/projected/4c2a1809-7476-46e1-a61e-582dfeb95093-kube-api-access-7868h\") pod \"4c2a1809-7476-46e1-a61e-582dfeb95093\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.916462 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-sg-core-conf-yaml\") pod \"4c2a1809-7476-46e1-a61e-582dfeb95093\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.916485 4628 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-config-data\") pod \"4c2a1809-7476-46e1-a61e-582dfeb95093\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.916646 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-combined-ca-bundle\") pod \"4c2a1809-7476-46e1-a61e-582dfeb95093\" (UID: \"4c2a1809-7476-46e1-a61e-582dfeb95093\") " Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.923268 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-scripts" (OuterVolumeSpecName: "scripts") pod "4c2a1809-7476-46e1-a61e-582dfeb95093" (UID: "4c2a1809-7476-46e1-a61e-582dfeb95093"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.923923 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-config-data" (OuterVolumeSpecName: "config-data") pod "4c2a1809-7476-46e1-a61e-582dfeb95093" (UID: "4c2a1809-7476-46e1-a61e-582dfeb95093"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.924149 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c2a1809-7476-46e1-a61e-582dfeb95093-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4c2a1809-7476-46e1-a61e-582dfeb95093" (UID: "4c2a1809-7476-46e1-a61e-582dfeb95093"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.924306 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c2a1809-7476-46e1-a61e-582dfeb95093-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4c2a1809-7476-46e1-a61e-582dfeb95093" (UID: "4c2a1809-7476-46e1-a61e-582dfeb95093"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.930474 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c2a1809-7476-46e1-a61e-582dfeb95093" (UID: "4c2a1809-7476-46e1-a61e-582dfeb95093"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.931108 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.931145 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 11 05:33:37 crc kubenswrapper[4628]: E1211 05:33:37.931483 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8" containerName="kube-state-metrics" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.931496 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8" containerName="kube-state-metrics" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.931697 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8" containerName="kube-state-metrics" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.932255 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.932339 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.935817 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.935862 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.945912 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4c2a1809-7476-46e1-a61e-582dfeb95093" (UID: "4c2a1809-7476-46e1-a61e-582dfeb95093"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:37 crc kubenswrapper[4628]: I1211 05:33:37.960988 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c2a1809-7476-46e1-a61e-582dfeb95093-kube-api-access-7868h" (OuterVolumeSpecName: "kube-api-access-7868h") pod "4c2a1809-7476-46e1-a61e-582dfeb95093" (UID: "4c2a1809-7476-46e1-a61e-582dfeb95093"). InnerVolumeSpecName "kube-api-access-7868h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.019619 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd99782-66a8-47e8-a4cf-d5f2805655dc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6dd99782-66a8-47e8-a4cf-d5f2805655dc\") " pod="openstack/kube-state-metrics-0" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.020170 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z78pk\" (UniqueName: \"kubernetes.io/projected/6dd99782-66a8-47e8-a4cf-d5f2805655dc-kube-api-access-z78pk\") pod \"kube-state-metrics-0\" (UID: \"6dd99782-66a8-47e8-a4cf-d5f2805655dc\") " pod="openstack/kube-state-metrics-0" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.020225 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6dd99782-66a8-47e8-a4cf-d5f2805655dc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6dd99782-66a8-47e8-a4cf-d5f2805655dc\") " pod="openstack/kube-state-metrics-0" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.021344 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd99782-66a8-47e8-a4cf-d5f2805655dc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6dd99782-66a8-47e8-a4cf-d5f2805655dc\") " pod="openstack/kube-state-metrics-0" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.021450 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.021470 4628 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c2a1809-7476-46e1-a61e-582dfeb95093-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.021481 4628 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c2a1809-7476-46e1-a61e-582dfeb95093-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.021490 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7868h\" (UniqueName: \"kubernetes.io/projected/4c2a1809-7476-46e1-a61e-582dfeb95093-kube-api-access-7868h\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.021499 4628 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.021509 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.021520 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c2a1809-7476-46e1-a61e-582dfeb95093-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 
05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.123419 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z78pk\" (UniqueName: \"kubernetes.io/projected/6dd99782-66a8-47e8-a4cf-d5f2805655dc-kube-api-access-z78pk\") pod \"kube-state-metrics-0\" (UID: \"6dd99782-66a8-47e8-a4cf-d5f2805655dc\") " pod="openstack/kube-state-metrics-0" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.123477 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6dd99782-66a8-47e8-a4cf-d5f2805655dc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6dd99782-66a8-47e8-a4cf-d5f2805655dc\") " pod="openstack/kube-state-metrics-0" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.124747 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd99782-66a8-47e8-a4cf-d5f2805655dc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6dd99782-66a8-47e8-a4cf-d5f2805655dc\") " pod="openstack/kube-state-metrics-0" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.124802 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd99782-66a8-47e8-a4cf-d5f2805655dc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6dd99782-66a8-47e8-a4cf-d5f2805655dc\") " pod="openstack/kube-state-metrics-0" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.127708 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6dd99782-66a8-47e8-a4cf-d5f2805655dc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6dd99782-66a8-47e8-a4cf-d5f2805655dc\") " pod="openstack/kube-state-metrics-0" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.129140 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd99782-66a8-47e8-a4cf-d5f2805655dc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6dd99782-66a8-47e8-a4cf-d5f2805655dc\") " pod="openstack/kube-state-metrics-0" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.131424 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd99782-66a8-47e8-a4cf-d5f2805655dc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6dd99782-66a8-47e8-a4cf-d5f2805655dc\") " pod="openstack/kube-state-metrics-0" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.150029 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z78pk\" (UniqueName: \"kubernetes.io/projected/6dd99782-66a8-47e8-a4cf-d5f2805655dc-kube-api-access-z78pk\") pod \"kube-state-metrics-0\" (UID: \"6dd99782-66a8-47e8-a4cf-d5f2805655dc\") " pod="openstack/kube-state-metrics-0" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.322614 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 11 05:33:38 crc kubenswrapper[4628]: W1211 05:33:38.815766 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dd99782_66a8_47e8_a4cf_d5f2805655dc.slice/crio-3f577f5e4a44585d6fc0936efe50600d453201ccc847451ad136bcd51afb0536 WatchSource:0}: Error finding container 3f577f5e4a44585d6fc0936efe50600d453201ccc847451ad136bcd51afb0536: Status 404 returned error can't find the container with id 3f577f5e4a44585d6fc0936efe50600d453201ccc847451ad136bcd51afb0536 Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.816345 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.834331 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6dd99782-66a8-47e8-a4cf-d5f2805655dc","Type":"ContainerStarted","Data":"3f577f5e4a44585d6fc0936efe50600d453201ccc847451ad136bcd51afb0536"} Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.834396 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.910573 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.924919 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.937769 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.948675 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.950147 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.952064 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 11 05:33:38 crc kubenswrapper[4628]: I1211 05:33:38.952495 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.052224 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-log-httpd\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.052280 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.052582 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-run-httpd\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.052733 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-config-data\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.053002 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.053082 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw7fn\" (UniqueName: \"kubernetes.io/projected/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-kube-api-access-mw7fn\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.053630 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-scripts\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.154931 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-config-data\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.155032 
4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.155061 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw7fn\" (UniqueName: \"kubernetes.io/projected/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-kube-api-access-mw7fn\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.155081 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-scripts\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.155109 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-log-httpd\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.155137 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.155178 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-run-httpd\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.155788 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-run-httpd\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.156652 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-log-httpd\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.165044 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-scripts\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.166379 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.166606 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.168186 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-config-data\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.171696 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw7fn\" (UniqueName: \"kubernetes.io/projected/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-kube-api-access-mw7fn\") pod \"ceilometer-0\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.305045 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.788416 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:39 crc kubenswrapper[4628]: W1211 05:33:39.791385 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode03b06e1_acfd_4eaa_9fb2_04e27b7ac61e.slice/crio-777298802f2aaeecac8086d7ede94b9666712b64a0efdc0cf3809a34ace675fc WatchSource:0}: Error finding container 777298802f2aaeecac8086d7ede94b9666712b64a0efdc0cf3809a34ace675fc: Status 404 returned error can't find the container with id 777298802f2aaeecac8086d7ede94b9666712b64a0efdc0cf3809a34ace675fc Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.829969 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.843727 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6dd99782-66a8-47e8-a4cf-d5f2805655dc","Type":"ContainerStarted","Data":"976276ba0933861677f7558706b9a5a14e60228005b83508c41fdf28a56253e3"} Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.843982 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.845465 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e","Type":"ContainerStarted","Data":"777298802f2aaeecac8086d7ede94b9666712b64a0efdc0cf3809a34ace675fc"} Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.867675 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.463285017 podStartE2EDuration="2.86765916s" podCreationTimestamp="2025-12-11 05:33:37 +0000 UTC" firstStartedPulling="2025-12-11 05:33:38.818343813 +0000 UTC m=+1121.235690511" lastFinishedPulling="2025-12-11 05:33:39.222717956 +0000 UTC m=+1121.640064654" observedRunningTime="2025-12-11 05:33:39.86546197 +0000 UTC m=+1122.282808668" watchObservedRunningTime="2025-12-11 05:33:39.86765916 +0000 UTC m=+1122.285005858" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.902150 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c2a1809-7476-46e1-a61e-582dfeb95093" 
path="/var/lib/kubelet/pods/4c2a1809-7476-46e1-a61e-582dfeb95093/volumes" Dec 11 05:33:39 crc kubenswrapper[4628]: I1211 05:33:39.902476 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8" path="/var/lib/kubelet/pods/ee4bf2fc-7897-4b0c-a391-7f3c2f712bf8/volumes" Dec 11 05:33:40 crc kubenswrapper[4628]: I1211 05:33:40.853765 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e","Type":"ContainerStarted","Data":"b1a8a1d12154f968e3f262e43b08e8162ba97af1c6ae5202cb8a1b5000cb8f96"} Dec 11 05:33:41 crc kubenswrapper[4628]: W1211 05:33:41.565977 4628 container.go:586] Failed to update stats for container "/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a3522a5_42e8_46ba_b794_d23582baa2a4.slice/crio-3e1b4981b4e3cf13b4acdc8cdbcd9a33f424e83ef5decc99c1a62f09e96cd81d": error while statting cgroup v2: [unable to parse /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a3522a5_42e8_46ba_b794_d23582baa2a4.slice/crio-3e1b4981b4e3cf13b4acdc8cdbcd9a33f424e83ef5decc99c1a62f09e96cd81d/memory.stat: read /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a3522a5_42e8_46ba_b794_d23582baa2a4.slice/crio-3e1b4981b4e3cf13b4acdc8cdbcd9a33f424e83ef5decc99c1a62f09e96cd81d/memory.stat: no such device], continuing to push stats Dec 11 05:33:41 crc kubenswrapper[4628]: I1211 05:33:41.869459 4628 generic.go:334] "Generic (PLEG): container finished" podID="8a3522a5-42e8-46ba-b794-d23582baa2a4" containerID="b60d821722cdfc3fd82f6785ddf1b0a7349d9bd58013594052c7fa0d037fb3be" exitCode=137 Dec 11 05:33:41 crc kubenswrapper[4628]: I1211 05:33:41.869875 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66bdd9d8cd-mgd96" event={"ID":"8a3522a5-42e8-46ba-b794-d23582baa2a4","Type":"ContainerDied","Data":"b60d821722cdfc3fd82f6785ddf1b0a7349d9bd58013594052c7fa0d037fb3be"} Dec 11 05:33:41 crc kubenswrapper[4628]: I1211 05:33:41.879530 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e","Type":"ContainerStarted","Data":"9045086bafa9d4cc82907939c5d4de90708aff7d1110d2deec96d620223dd8b2"} Dec 11 05:33:41 crc kubenswrapper[4628]: I1211 05:33:41.979981 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.106588 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3522a5-42e8-46ba-b794-d23582baa2a4-logs\") pod \"8a3522a5-42e8-46ba-b794-d23582baa2a4\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.106637 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a3522a5-42e8-46ba-b794-d23582baa2a4-horizon-secret-key\") pod \"8a3522a5-42e8-46ba-b794-d23582baa2a4\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.106674 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d5ww\" (UniqueName: \"kubernetes.io/projected/8a3522a5-42e8-46ba-b794-d23582baa2a4-kube-api-access-8d5ww\") pod \"8a3522a5-42e8-46ba-b794-d23582baa2a4\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.106758 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a3522a5-42e8-46ba-b794-d23582baa2a4-scripts\") pod \"8a3522a5-42e8-46ba-b794-d23582baa2a4\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.106862 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3522a5-42e8-46ba-b794-d23582baa2a4-horizon-tls-certs\") pod \"8a3522a5-42e8-46ba-b794-d23582baa2a4\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.106883 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a3522a5-42e8-46ba-b794-d23582baa2a4-config-data\") pod \"8a3522a5-42e8-46ba-b794-d23582baa2a4\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.106934 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3522a5-42e8-46ba-b794-d23582baa2a4-combined-ca-bundle\") pod \"8a3522a5-42e8-46ba-b794-d23582baa2a4\" (UID: \"8a3522a5-42e8-46ba-b794-d23582baa2a4\") " Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.107551 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3522a5-42e8-46ba-b794-d23582baa2a4-logs" (OuterVolumeSpecName: "logs") pod "8a3522a5-42e8-46ba-b794-d23582baa2a4" (UID: "8a3522a5-42e8-46ba-b794-d23582baa2a4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.117482 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a3522a5-42e8-46ba-b794-d23582baa2a4-kube-api-access-8d5ww" (OuterVolumeSpecName: "kube-api-access-8d5ww") pod "8a3522a5-42e8-46ba-b794-d23582baa2a4" (UID: "8a3522a5-42e8-46ba-b794-d23582baa2a4"). InnerVolumeSpecName "kube-api-access-8d5ww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.117935 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3522a5-42e8-46ba-b794-d23582baa2a4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8a3522a5-42e8-46ba-b794-d23582baa2a4" (UID: "8a3522a5-42e8-46ba-b794-d23582baa2a4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.140639 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a3522a5-42e8-46ba-b794-d23582baa2a4-config-data" (OuterVolumeSpecName: "config-data") pod "8a3522a5-42e8-46ba-b794-d23582baa2a4" (UID: "8a3522a5-42e8-46ba-b794-d23582baa2a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.145949 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a3522a5-42e8-46ba-b794-d23582baa2a4-scripts" (OuterVolumeSpecName: "scripts") pod "8a3522a5-42e8-46ba-b794-d23582baa2a4" (UID: "8a3522a5-42e8-46ba-b794-d23582baa2a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.156693 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3522a5-42e8-46ba-b794-d23582baa2a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a3522a5-42e8-46ba-b794-d23582baa2a4" (UID: "8a3522a5-42e8-46ba-b794-d23582baa2a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.181716 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3522a5-42e8-46ba-b794-d23582baa2a4-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "8a3522a5-42e8-46ba-b794-d23582baa2a4" (UID: "8a3522a5-42e8-46ba-b794-d23582baa2a4"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.209019 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a3522a5-42e8-46ba-b794-d23582baa2a4-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.209059 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3522a5-42e8-46ba-b794-d23582baa2a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.209076 4628 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a3522a5-42e8-46ba-b794-d23582baa2a4-logs\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.209086 4628 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a3522a5-42e8-46ba-b794-d23582baa2a4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.209098 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d5ww\" (UniqueName: \"kubernetes.io/projected/8a3522a5-42e8-46ba-b794-d23582baa2a4-kube-api-access-8d5ww\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.209111 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a3522a5-42e8-46ba-b794-d23582baa2a4-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.209122 4628 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a3522a5-42e8-46ba-b794-d23582baa2a4-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.896414 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e","Type":"ContainerStarted","Data":"95e0739db49bea7cb76d877674cb324da6a19c162bc571bfa1a53ef7cedef3a8"} Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.899034 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66bdd9d8cd-mgd96" event={"ID":"8a3522a5-42e8-46ba-b794-d23582baa2a4","Type":"ContainerDied","Data":"3e1b4981b4e3cf13b4acdc8cdbcd9a33f424e83ef5decc99c1a62f09e96cd81d"} Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.899067 4628 scope.go:117] "RemoveContainer" containerID="a746c66aade8058642983deeedde27bfe41ebbcf4cc43d9cec8d1a2cd699c9e3" Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.899175 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-66bdd9d8cd-mgd96" Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.951417 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-66bdd9d8cd-mgd96"] Dec 11 05:33:42 crc kubenswrapper[4628]: I1211 05:33:42.971142 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-66bdd9d8cd-mgd96"] Dec 11 05:33:43 crc kubenswrapper[4628]: I1211 05:33:43.104036 4628 scope.go:117] "RemoveContainer" containerID="b60d821722cdfc3fd82f6785ddf1b0a7349d9bd58013594052c7fa0d037fb3be" Dec 11 05:33:43 crc kubenswrapper[4628]: I1211 05:33:43.899681 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a3522a5-42e8-46ba-b794-d23582baa2a4" path="/var/lib/kubelet/pods/8a3522a5-42e8-46ba-b794-d23582baa2a4/volumes" Dec 11 05:33:44 crc kubenswrapper[4628]: I1211 05:33:44.918214 4628 generic.go:334] "Generic (PLEG): container finished" podID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerID="30996f796f9465a20b77a461096f57a78888f77c906e9e3a13126f5062293f9d" exitCode=1 Dec 11 05:33:44 crc kubenswrapper[4628]: I1211 05:33:44.918407 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e","Type":"ContainerDied","Data":"30996f796f9465a20b77a461096f57a78888f77c906e9e3a13126f5062293f9d"} Dec 11 05:33:44 crc kubenswrapper[4628]: I1211 05:33:44.918578 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerName="ceilometer-central-agent" containerID="cri-o://b1a8a1d12154f968e3f262e43b08e8162ba97af1c6ae5202cb8a1b5000cb8f96" gracePeriod=30 Dec 11 05:33:44 crc kubenswrapper[4628]: I1211 05:33:44.918905 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerName="sg-core" containerID="cri-o://95e0739db49bea7cb76d877674cb324da6a19c162bc571bfa1a53ef7cedef3a8" gracePeriod=30 Dec 11 05:33:44 crc kubenswrapper[4628]: I1211 05:33:44.918918 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerName="ceilometer-notification-agent" containerID="cri-o://9045086bafa9d4cc82907939c5d4de90708aff7d1110d2deec96d620223dd8b2" gracePeriod=30 Dec 11 05:33:45 crc kubenswrapper[4628]: I1211 05:33:45.928999 4628 generic.go:334] "Generic (PLEG): container finished" podID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerID="95e0739db49bea7cb76d877674cb324da6a19c162bc571bfa1a53ef7cedef3a8" exitCode=2 Dec 11 05:33:45 crc kubenswrapper[4628]: I1211 05:33:45.929348 4628 generic.go:334] "Generic (PLEG): container finished" podID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerID="9045086bafa9d4cc82907939c5d4de90708aff7d1110d2deec96d620223dd8b2" exitCode=0 Dec 11 05:33:45 crc kubenswrapper[4628]: I1211 05:33:45.929076 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e","Type":"ContainerDied","Data":"95e0739db49bea7cb76d877674cb324da6a19c162bc571bfa1a53ef7cedef3a8"} Dec 11 05:33:45 crc kubenswrapper[4628]: I1211 05:33:45.929384 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e","Type":"ContainerDied","Data":"9045086bafa9d4cc82907939c5d4de90708aff7d1110d2deec96d620223dd8b2"} Dec 11 05:33:47 crc 
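The "Killing container with a grace period" entries above stop the ceilometer-0 containers with gracePeriod=30, the Kubernetes default for terminationGracePeriodSeconds. A deletion issued through the API can set the same value explicitly; a minimal client-go sketch, assuming in-cluster credentials (the names are taken from this log, the clientset setup is an assumption):

package main

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig() // assumes this runs inside the cluster
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// Ask the API server to delete ceilometer-0 and give its containers
	// 30 seconds between SIGTERM and SIGKILL, matching gracePeriod=30 above.
	grace := int64(30)
	err = client.CoreV1().Pods("openstack").Delete(context.TODO(), "ceilometer-0",
		metav1.DeleteOptions{GracePeriodSeconds: &grace})
	if err != nil {
		panic(err)
	}
}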
kubenswrapper[4628]: I1211 05:33:47.531765 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-clb84"] Dec 11 05:33:47 crc kubenswrapper[4628]: E1211 05:33:47.532389 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3522a5-42e8-46ba-b794-d23582baa2a4" containerName="horizon-log" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.532402 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3522a5-42e8-46ba-b794-d23582baa2a4" containerName="horizon-log" Dec 11 05:33:47 crc kubenswrapper[4628]: E1211 05:33:47.532437 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3522a5-42e8-46ba-b794-d23582baa2a4" containerName="horizon" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.532443 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3522a5-42e8-46ba-b794-d23582baa2a4" containerName="horizon" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.532603 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a3522a5-42e8-46ba-b794-d23582baa2a4" containerName="horizon-log" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.532621 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a3522a5-42e8-46ba-b794-d23582baa2a4" containerName="horizon" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.533407 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-clb84" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.546584 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-clb84"] Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.608626 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkm9w\" (UniqueName: \"kubernetes.io/projected/bb00fa79-0866-4f48-b001-3c05352e47aa-kube-api-access-qkm9w\") pod \"nova-api-db-create-clb84\" (UID: \"bb00fa79-0866-4f48-b001-3c05352e47aa\") " pod="openstack/nova-api-db-create-clb84" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.608727 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb00fa79-0866-4f48-b001-3c05352e47aa-operator-scripts\") pod \"nova-api-db-create-clb84\" (UID: \"bb00fa79-0866-4f48-b001-3c05352e47aa\") " pod="openstack/nova-api-db-create-clb84" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.630900 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-7ppf8"] Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.632163 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-7ppf8" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.637010 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-7ppf8"] Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.711566 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q595n\" (UniqueName: \"kubernetes.io/projected/7642b050-2c3e-4a3d-bc5d-b5e007cf316f-kube-api-access-q595n\") pod \"nova-cell0-db-create-7ppf8\" (UID: \"7642b050-2c3e-4a3d-bc5d-b5e007cf316f\") " pod="openstack/nova-cell0-db-create-7ppf8" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.711619 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkm9w\" (UniqueName: \"kubernetes.io/projected/bb00fa79-0866-4f48-b001-3c05352e47aa-kube-api-access-qkm9w\") pod \"nova-api-db-create-clb84\" (UID: \"bb00fa79-0866-4f48-b001-3c05352e47aa\") " pod="openstack/nova-api-db-create-clb84" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.711681 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb00fa79-0866-4f48-b001-3c05352e47aa-operator-scripts\") pod \"nova-api-db-create-clb84\" (UID: \"bb00fa79-0866-4f48-b001-3c05352e47aa\") " pod="openstack/nova-api-db-create-clb84" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.711713 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7642b050-2c3e-4a3d-bc5d-b5e007cf316f-operator-scripts\") pod \"nova-cell0-db-create-7ppf8\" (UID: \"7642b050-2c3e-4a3d-bc5d-b5e007cf316f\") " pod="openstack/nova-cell0-db-create-7ppf8" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.714923 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb00fa79-0866-4f48-b001-3c05352e47aa-operator-scripts\") pod \"nova-api-db-create-clb84\" (UID: \"bb00fa79-0866-4f48-b001-3c05352e47aa\") " pod="openstack/nova-api-db-create-clb84" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.747717 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkm9w\" (UniqueName: \"kubernetes.io/projected/bb00fa79-0866-4f48-b001-3c05352e47aa-kube-api-access-qkm9w\") pod \"nova-api-db-create-clb84\" (UID: \"bb00fa79-0866-4f48-b001-3c05352e47aa\") " pod="openstack/nova-api-db-create-clb84" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.774623 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-024a-account-create-update-b9tx5"] Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.775783 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-024a-account-create-update-b9tx5" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.782704 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.783298 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-024a-account-create-update-b9tx5"] Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.816026 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q595n\" (UniqueName: \"kubernetes.io/projected/7642b050-2c3e-4a3d-bc5d-b5e007cf316f-kube-api-access-q595n\") pod \"nova-cell0-db-create-7ppf8\" (UID: \"7642b050-2c3e-4a3d-bc5d-b5e007cf316f\") " pod="openstack/nova-cell0-db-create-7ppf8" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.816327 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7642b050-2c3e-4a3d-bc5d-b5e007cf316f-operator-scripts\") pod \"nova-cell0-db-create-7ppf8\" (UID: \"7642b050-2c3e-4a3d-bc5d-b5e007cf316f\") " pod="openstack/nova-cell0-db-create-7ppf8" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.817474 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7642b050-2c3e-4a3d-bc5d-b5e007cf316f-operator-scripts\") pod \"nova-cell0-db-create-7ppf8\" (UID: \"7642b050-2c3e-4a3d-bc5d-b5e007cf316f\") " pod="openstack/nova-cell0-db-create-7ppf8" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.837411 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-csbvp"] Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.839261 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-csbvp" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.869128 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q595n\" (UniqueName: \"kubernetes.io/projected/7642b050-2c3e-4a3d-bc5d-b5e007cf316f-kube-api-access-q595n\") pod \"nova-cell0-db-create-7ppf8\" (UID: \"7642b050-2c3e-4a3d-bc5d-b5e007cf316f\") " pod="openstack/nova-cell0-db-create-7ppf8" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.872295 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-csbvp"] Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.891163 4628 util.go:30] "No sandbox for pod can be found. 
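The reflector.go entries such as Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" show the kubelet watching only the individual Secrets a pod references rather than listing the whole namespace. A rough client-go equivalent of such a single-object watch, scoped by a field selector on the name; this illustrates the pattern, not the kubelet's internal wiring, and the kubeconfig path is an assumption:

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile) // assumes ~/.kube/config
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// Watch exactly one Secret by name, the same scoping the per-object
	// reflector entries above imply.
	w, err := client.CoreV1().Secrets("openstack").Watch(context.TODO(), metav1.ListOptions{
		FieldSelector: "metadata.name=nova-api-db-secret",
	})
	if err != nil {
		panic(err)
	}
	defer w.Stop()
	for ev := range w.ResultChan() {
		fmt.Println("event:", ev.Type)
	}
}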
Need to start a new one" pod="openstack/nova-api-db-create-clb84" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.922540 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b-operator-scripts\") pod \"nova-api-024a-account-create-update-b9tx5\" (UID: \"2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b\") " pod="openstack/nova-api-024a-account-create-update-b9tx5" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.922589 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86mcs\" (UniqueName: \"kubernetes.io/projected/c3eb6441-7841-43ef-9036-08e2b3d43ed2-kube-api-access-86mcs\") pod \"nova-cell1-db-create-csbvp\" (UID: \"c3eb6441-7841-43ef-9036-08e2b3d43ed2\") " pod="openstack/nova-cell1-db-create-csbvp" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.922610 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3eb6441-7841-43ef-9036-08e2b3d43ed2-operator-scripts\") pod \"nova-cell1-db-create-csbvp\" (UID: \"c3eb6441-7841-43ef-9036-08e2b3d43ed2\") " pod="openstack/nova-cell1-db-create-csbvp" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.934607 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxxzl\" (UniqueName: \"kubernetes.io/projected/2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b-kube-api-access-kxxzl\") pod \"nova-api-024a-account-create-update-b9tx5\" (UID: \"2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b\") " pod="openstack/nova-api-024a-account-create-update-b9tx5" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.936814 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-5519-account-create-update-nfplp"] Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.939185 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5519-account-create-update-nfplp" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.948290 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-7ppf8" Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.950005 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5519-account-create-update-nfplp"] Dec 11 05:33:47 crc kubenswrapper[4628]: I1211 05:33:47.978233 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.036349 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v54x\" (UniqueName: \"kubernetes.io/projected/d4dc7021-e4d0-4791-9760-4056d74989ad-kube-api-access-5v54x\") pod \"nova-cell0-5519-account-create-update-nfplp\" (UID: \"d4dc7021-e4d0-4791-9760-4056d74989ad\") " pod="openstack/nova-cell0-5519-account-create-update-nfplp" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.036592 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b-operator-scripts\") pod \"nova-api-024a-account-create-update-b9tx5\" (UID: \"2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b\") " pod="openstack/nova-api-024a-account-create-update-b9tx5" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.036649 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86mcs\" (UniqueName: \"kubernetes.io/projected/c3eb6441-7841-43ef-9036-08e2b3d43ed2-kube-api-access-86mcs\") pod \"nova-cell1-db-create-csbvp\" (UID: \"c3eb6441-7841-43ef-9036-08e2b3d43ed2\") " pod="openstack/nova-cell1-db-create-csbvp" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.036667 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3eb6441-7841-43ef-9036-08e2b3d43ed2-operator-scripts\") pod \"nova-cell1-db-create-csbvp\" (UID: \"c3eb6441-7841-43ef-9036-08e2b3d43ed2\") " pod="openstack/nova-cell1-db-create-csbvp" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.036803 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxxzl\" (UniqueName: \"kubernetes.io/projected/2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b-kube-api-access-kxxzl\") pod \"nova-api-024a-account-create-update-b9tx5\" (UID: \"2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b\") " pod="openstack/nova-api-024a-account-create-update-b9tx5" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.037146 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4dc7021-e4d0-4791-9760-4056d74989ad-operator-scripts\") pod \"nova-cell0-5519-account-create-update-nfplp\" (UID: \"d4dc7021-e4d0-4791-9760-4056d74989ad\") " pod="openstack/nova-cell0-5519-account-create-update-nfplp" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.037767 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b-operator-scripts\") pod \"nova-api-024a-account-create-update-b9tx5\" (UID: \"2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b\") " pod="openstack/nova-api-024a-account-create-update-b9tx5" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.039076 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c3eb6441-7841-43ef-9036-08e2b3d43ed2-operator-scripts\") pod \"nova-cell1-db-create-csbvp\" (UID: \"c3eb6441-7841-43ef-9036-08e2b3d43ed2\") " pod="openstack/nova-cell1-db-create-csbvp" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.057186 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxxzl\" (UniqueName: \"kubernetes.io/projected/2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b-kube-api-access-kxxzl\") pod \"nova-api-024a-account-create-update-b9tx5\" (UID: \"2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b\") " pod="openstack/nova-api-024a-account-create-update-b9tx5" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.062333 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86mcs\" (UniqueName: \"kubernetes.io/projected/c3eb6441-7841-43ef-9036-08e2b3d43ed2-kube-api-access-86mcs\") pod \"nova-cell1-db-create-csbvp\" (UID: \"c3eb6441-7841-43ef-9036-08e2b3d43ed2\") " pod="openstack/nova-cell1-db-create-csbvp" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.120456 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-024a-account-create-update-b9tx5" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.140882 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v54x\" (UniqueName: \"kubernetes.io/projected/d4dc7021-e4d0-4791-9760-4056d74989ad-kube-api-access-5v54x\") pod \"nova-cell0-5519-account-create-update-nfplp\" (UID: \"d4dc7021-e4d0-4791-9760-4056d74989ad\") " pod="openstack/nova-cell0-5519-account-create-update-nfplp" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.141113 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4dc7021-e4d0-4791-9760-4056d74989ad-operator-scripts\") pod \"nova-cell0-5519-account-create-update-nfplp\" (UID: \"d4dc7021-e4d0-4791-9760-4056d74989ad\") " pod="openstack/nova-cell0-5519-account-create-update-nfplp" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.145282 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4dc7021-e4d0-4791-9760-4056d74989ad-operator-scripts\") pod \"nova-cell0-5519-account-create-update-nfplp\" (UID: \"d4dc7021-e4d0-4791-9760-4056d74989ad\") " pod="openstack/nova-cell0-5519-account-create-update-nfplp" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.169113 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-24f1-account-create-update-qfv25"] Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.171027 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-24f1-account-create-update-qfv25" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.175267 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v54x\" (UniqueName: \"kubernetes.io/projected/d4dc7021-e4d0-4791-9760-4056d74989ad-kube-api-access-5v54x\") pod \"nova-cell0-5519-account-create-update-nfplp\" (UID: \"d4dc7021-e4d0-4791-9760-4056d74989ad\") " pod="openstack/nova-cell0-5519-account-create-update-nfplp" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.179808 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.184705 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-24f1-account-create-update-qfv25"] Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.227350 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-csbvp" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.243108 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49wwk\" (UniqueName: \"kubernetes.io/projected/4e8b0526-5598-4458-8e28-c43557f08cf9-kube-api-access-49wwk\") pod \"nova-cell1-24f1-account-create-update-qfv25\" (UID: \"4e8b0526-5598-4458-8e28-c43557f08cf9\") " pod="openstack/nova-cell1-24f1-account-create-update-qfv25" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.243174 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e8b0526-5598-4458-8e28-c43557f08cf9-operator-scripts\") pod \"nova-cell1-24f1-account-create-update-qfv25\" (UID: \"4e8b0526-5598-4458-8e28-c43557f08cf9\") " pod="openstack/nova-cell1-24f1-account-create-update-qfv25" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.344675 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49wwk\" (UniqueName: \"kubernetes.io/projected/4e8b0526-5598-4458-8e28-c43557f08cf9-kube-api-access-49wwk\") pod \"nova-cell1-24f1-account-create-update-qfv25\" (UID: \"4e8b0526-5598-4458-8e28-c43557f08cf9\") " pod="openstack/nova-cell1-24f1-account-create-update-qfv25" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.344722 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e8b0526-5598-4458-8e28-c43557f08cf9-operator-scripts\") pod \"nova-cell1-24f1-account-create-update-qfv25\" (UID: \"4e8b0526-5598-4458-8e28-c43557f08cf9\") " pod="openstack/nova-cell1-24f1-account-create-update-qfv25" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.345670 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e8b0526-5598-4458-8e28-c43557f08cf9-operator-scripts\") pod \"nova-cell1-24f1-account-create-update-qfv25\" (UID: \"4e8b0526-5598-4458-8e28-c43557f08cf9\") " pod="openstack/nova-cell1-24f1-account-create-update-qfv25" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.350300 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.378700 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49wwk\" (UniqueName: 
\"kubernetes.io/projected/4e8b0526-5598-4458-8e28-c43557f08cf9-kube-api-access-49wwk\") pod \"nova-cell1-24f1-account-create-update-qfv25\" (UID: \"4e8b0526-5598-4458-8e28-c43557f08cf9\") " pod="openstack/nova-cell1-24f1-account-create-update-qfv25" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.384041 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5519-account-create-update-nfplp" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.530473 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-24f1-account-create-update-qfv25" Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.624089 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-clb84"] Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.722568 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-7ppf8"] Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.787819 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-024a-account-create-update-b9tx5"] Dec 11 05:33:48 crc kubenswrapper[4628]: I1211 05:33:48.879512 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-csbvp"] Dec 11 05:33:49 crc kubenswrapper[4628]: I1211 05:33:49.000699 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-024a-account-create-update-b9tx5" event={"ID":"2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b","Type":"ContainerStarted","Data":"9f8acebe5fbc20bafb3d5a4b7ce8d906600069e32cca32278da2dd1792f954d8"} Dec 11 05:33:49 crc kubenswrapper[4628]: I1211 05:33:49.002309 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7ppf8" event={"ID":"7642b050-2c3e-4a3d-bc5d-b5e007cf316f","Type":"ContainerStarted","Data":"e6815bc28378c823b251b74a68b853cb233d92ab511bf57a40a3421ecaf9b466"} Dec 11 05:33:49 crc kubenswrapper[4628]: I1211 05:33:49.007017 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-clb84" event={"ID":"bb00fa79-0866-4f48-b001-3c05352e47aa","Type":"ContainerStarted","Data":"729ad22eaf349f0f1464ab47631d75426a477a4441363885d342781a9d674e18"} Dec 11 05:33:49 crc kubenswrapper[4628]: I1211 05:33:49.008339 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-csbvp" event={"ID":"c3eb6441-7841-43ef-9036-08e2b3d43ed2","Type":"ContainerStarted","Data":"0e23ed70c76c25b763491f5f7f2f7e88655f4dd5380988543c1882e1ae9166a0"} Dec 11 05:33:49 crc kubenswrapper[4628]: I1211 05:33:49.078897 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5519-account-create-update-nfplp"] Dec 11 05:33:49 crc kubenswrapper[4628]: I1211 05:33:49.173113 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-24f1-account-create-update-qfv25"] Dec 11 05:33:49 crc kubenswrapper[4628]: W1211 05:33:49.191419 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e8b0526_5598_4458_8e28_c43557f08cf9.slice/crio-0e2d3083c90ea61312bb0a1cb9bc230aad76ef7b199c7cd2d28def72be9d3c1e WatchSource:0}: Error finding container 0e2d3083c90ea61312bb0a1cb9bc230aad76ef7b199c7cd2d28def72be9d3c1e: Status 404 returned error can't find the container with id 0e2d3083c90ea61312bb0a1cb9bc230aad76ef7b199c7cd2d28def72be9d3c1e Dec 11 05:33:50 crc kubenswrapper[4628]: I1211 05:33:50.020070 4628 
generic.go:334] "Generic (PLEG): container finished" podID="bb00fa79-0866-4f48-b001-3c05352e47aa" containerID="c2761e05717dbb3b4597a9e322f788ac9de6cba83cab188acb66e2fdbf827e36" exitCode=0 Dec 11 05:33:50 crc kubenswrapper[4628]: I1211 05:33:50.020393 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-clb84" event={"ID":"bb00fa79-0866-4f48-b001-3c05352e47aa","Type":"ContainerDied","Data":"c2761e05717dbb3b4597a9e322f788ac9de6cba83cab188acb66e2fdbf827e36"} Dec 11 05:33:50 crc kubenswrapper[4628]: I1211 05:33:50.022974 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5519-account-create-update-nfplp" event={"ID":"d4dc7021-e4d0-4791-9760-4056d74989ad","Type":"ContainerStarted","Data":"bf7a6621a5d2fc28767cc644433bddbabdae8defb17a6efe742d4e9ecbb29f44"} Dec 11 05:33:50 crc kubenswrapper[4628]: I1211 05:33:50.023020 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5519-account-create-update-nfplp" event={"ID":"d4dc7021-e4d0-4791-9760-4056d74989ad","Type":"ContainerStarted","Data":"2574cfa98c00c2ad58dc69091fabfa074a45c5824ae5f0e452e6b23866c0ae9b"} Dec 11 05:33:50 crc kubenswrapper[4628]: I1211 05:33:50.028731 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-csbvp" event={"ID":"c3eb6441-7841-43ef-9036-08e2b3d43ed2","Type":"ContainerStarted","Data":"acf62138e38a1eec9a1a8034713f4c840aa494d78ac1bd65b44df018f272e4ca"} Dec 11 05:33:50 crc kubenswrapper[4628]: I1211 05:33:50.030704 4628 generic.go:334] "Generic (PLEG): container finished" podID="2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b" containerID="046dc26b9a5c282cbf7679fa1ed8db17517608d8f12451328b5bd0ae51a7888c" exitCode=0 Dec 11 05:33:50 crc kubenswrapper[4628]: I1211 05:33:50.030752 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-024a-account-create-update-b9tx5" event={"ID":"2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b","Type":"ContainerDied","Data":"046dc26b9a5c282cbf7679fa1ed8db17517608d8f12451328b5bd0ae51a7888c"} Dec 11 05:33:50 crc kubenswrapper[4628]: I1211 05:33:50.035020 4628 generic.go:334] "Generic (PLEG): container finished" podID="7642b050-2c3e-4a3d-bc5d-b5e007cf316f" containerID="f814a17eaba6c66b5c4fc7918b5ed45d24e734b6aa1ba1d977e6859c4161547d" exitCode=0 Dec 11 05:33:50 crc kubenswrapper[4628]: I1211 05:33:50.035061 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7ppf8" event={"ID":"7642b050-2c3e-4a3d-bc5d-b5e007cf316f","Type":"ContainerDied","Data":"f814a17eaba6c66b5c4fc7918b5ed45d24e734b6aa1ba1d977e6859c4161547d"} Dec 11 05:33:50 crc kubenswrapper[4628]: I1211 05:33:50.047192 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-24f1-account-create-update-qfv25" event={"ID":"4e8b0526-5598-4458-8e28-c43557f08cf9","Type":"ContainerStarted","Data":"91d14f67d5473739a1fe6ef267a127788dcf7f4b06fbd3c8133b42eaa6bb21c4"} Dec 11 05:33:50 crc kubenswrapper[4628]: I1211 05:33:50.047244 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-24f1-account-create-update-qfv25" event={"ID":"4e8b0526-5598-4458-8e28-c43557f08cf9","Type":"ContainerStarted","Data":"0e2d3083c90ea61312bb0a1cb9bc230aad76ef7b199c7cd2d28def72be9d3c1e"} Dec 11 05:33:50 crc kubenswrapper[4628]: I1211 05:33:50.067072 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-5519-account-create-update-nfplp" podStartSLOduration=3.067051191 podStartE2EDuration="3.067051191s" 
podCreationTimestamp="2025-12-11 05:33:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:33:50.053171404 +0000 UTC m=+1132.470518102" watchObservedRunningTime="2025-12-11 05:33:50.067051191 +0000 UTC m=+1132.484397889" Dec 11 05:33:50 crc kubenswrapper[4628]: I1211 05:33:50.106164 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-csbvp" podStartSLOduration=3.106141974 podStartE2EDuration="3.106141974s" podCreationTimestamp="2025-12-11 05:33:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:33:50.086173771 +0000 UTC m=+1132.503520469" watchObservedRunningTime="2025-12-11 05:33:50.106141974 +0000 UTC m=+1132.523488672" Dec 11 05:33:50 crc kubenswrapper[4628]: I1211 05:33:50.121546 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-24f1-account-create-update-qfv25" podStartSLOduration=2.121526002 podStartE2EDuration="2.121526002s" podCreationTimestamp="2025-12-11 05:33:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:33:50.117001209 +0000 UTC m=+1132.534347907" watchObservedRunningTime="2025-12-11 05:33:50.121526002 +0000 UTC m=+1132.538872700" Dec 11 05:33:50 crc kubenswrapper[4628]: I1211 05:33:50.873733 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.012978 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-combined-ca-bundle\") pod \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.013057 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-run-httpd\") pod \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.013095 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw7fn\" (UniqueName: \"kubernetes.io/projected/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-kube-api-access-mw7fn\") pod \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.013118 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-scripts\") pod \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.013143 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-sg-core-conf-yaml\") pod \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.013192 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-log-httpd\") pod \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.013246 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-config-data\") pod \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\" (UID: \"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e\") " Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.015996 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" (UID: "e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.016924 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" (UID: "e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.028152 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-kube-api-access-mw7fn" (OuterVolumeSpecName: "kube-api-access-mw7fn") pod "e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" (UID: "e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e"). InnerVolumeSpecName "kube-api-access-mw7fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.043501 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-scripts" (OuterVolumeSpecName: "scripts") pod "e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" (UID: "e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.066979 4628 generic.go:334] "Generic (PLEG): container finished" podID="4e8b0526-5598-4458-8e28-c43557f08cf9" containerID="91d14f67d5473739a1fe6ef267a127788dcf7f4b06fbd3c8133b42eaa6bb21c4" exitCode=0 Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.067203 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-24f1-account-create-update-qfv25" event={"ID":"4e8b0526-5598-4458-8e28-c43557f08cf9","Type":"ContainerDied","Data":"91d14f67d5473739a1fe6ef267a127788dcf7f4b06fbd3c8133b42eaa6bb21c4"} Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.070399 4628 generic.go:334] "Generic (PLEG): container finished" podID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerID="b1a8a1d12154f968e3f262e43b08e8162ba97af1c6ae5202cb8a1b5000cb8f96" exitCode=0 Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.070532 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e","Type":"ContainerDied","Data":"b1a8a1d12154f968e3f262e43b08e8162ba97af1c6ae5202cb8a1b5000cb8f96"} Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.070611 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e","Type":"ContainerDied","Data":"777298802f2aaeecac8086d7ede94b9666712b64a0efdc0cf3809a34ace675fc"} Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.070698 4628 scope.go:117] "RemoveContainer" containerID="30996f796f9465a20b77a461096f57a78888f77c906e9e3a13126f5062293f9d" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.070951 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.075244 4628 generic.go:334] "Generic (PLEG): container finished" podID="d4dc7021-e4d0-4791-9760-4056d74989ad" containerID="bf7a6621a5d2fc28767cc644433bddbabdae8defb17a6efe742d4e9ecbb29f44" exitCode=0 Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.075394 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5519-account-create-update-nfplp" event={"ID":"d4dc7021-e4d0-4791-9760-4056d74989ad","Type":"ContainerDied","Data":"bf7a6621a5d2fc28767cc644433bddbabdae8defb17a6efe742d4e9ecbb29f44"} Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.076902 4628 generic.go:334] "Generic (PLEG): container finished" podID="c3eb6441-7841-43ef-9036-08e2b3d43ed2" containerID="acf62138e38a1eec9a1a8034713f4c840aa494d78ac1bd65b44df018f272e4ca" exitCode=0 Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.077186 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-csbvp" event={"ID":"c3eb6441-7841-43ef-9036-08e2b3d43ed2","Type":"ContainerDied","Data":"acf62138e38a1eec9a1a8034713f4c840aa494d78ac1bd65b44df018f272e4ca"} Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.102296 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" (UID: "e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.117915 4628 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.117953 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw7fn\" (UniqueName: \"kubernetes.io/projected/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-kube-api-access-mw7fn\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.117967 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.117982 4628 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.117993 4628 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.145513 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" (UID: "e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.204351 4628 scope.go:117] "RemoveContainer" containerID="95e0739db49bea7cb76d877674cb324da6a19c162bc571bfa1a53ef7cedef3a8" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.228384 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.268329 4628 scope.go:117] "RemoveContainer" containerID="9045086bafa9d4cc82907939c5d4de90708aff7d1110d2deec96d620223dd8b2" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.287934 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-config-data" (OuterVolumeSpecName: "config-data") pod "e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" (UID: "e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.291719 4628 scope.go:117] "RemoveContainer" containerID="b1a8a1d12154f968e3f262e43b08e8162ba97af1c6ae5202cb8a1b5000cb8f96" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.330696 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.347991 4628 scope.go:117] "RemoveContainer" containerID="30996f796f9465a20b77a461096f57a78888f77c906e9e3a13126f5062293f9d" Dec 11 05:33:51 crc kubenswrapper[4628]: E1211 05:33:51.352216 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30996f796f9465a20b77a461096f57a78888f77c906e9e3a13126f5062293f9d\": container with ID starting with 30996f796f9465a20b77a461096f57a78888f77c906e9e3a13126f5062293f9d not found: ID does not exist" containerID="30996f796f9465a20b77a461096f57a78888f77c906e9e3a13126f5062293f9d" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.352363 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30996f796f9465a20b77a461096f57a78888f77c906e9e3a13126f5062293f9d"} err="failed to get container status \"30996f796f9465a20b77a461096f57a78888f77c906e9e3a13126f5062293f9d\": rpc error: code = NotFound desc = could not find container \"30996f796f9465a20b77a461096f57a78888f77c906e9e3a13126f5062293f9d\": container with ID starting with 30996f796f9465a20b77a461096f57a78888f77c906e9e3a13126f5062293f9d not found: ID does not exist" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.352462 4628 scope.go:117] "RemoveContainer" containerID="95e0739db49bea7cb76d877674cb324da6a19c162bc571bfa1a53ef7cedef3a8" Dec 11 05:33:51 crc kubenswrapper[4628]: E1211 05:33:51.352743 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95e0739db49bea7cb76d877674cb324da6a19c162bc571bfa1a53ef7cedef3a8\": container with ID starting with 95e0739db49bea7cb76d877674cb324da6a19c162bc571bfa1a53ef7cedef3a8 not found: ID does not exist" containerID="95e0739db49bea7cb76d877674cb324da6a19c162bc571bfa1a53ef7cedef3a8" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.352875 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95e0739db49bea7cb76d877674cb324da6a19c162bc571bfa1a53ef7cedef3a8"} err="failed to get container status \"95e0739db49bea7cb76d877674cb324da6a19c162bc571bfa1a53ef7cedef3a8\": rpc error: code = NotFound desc = could not find container \"95e0739db49bea7cb76d877674cb324da6a19c162bc571bfa1a53ef7cedef3a8\": container with ID starting with 95e0739db49bea7cb76d877674cb324da6a19c162bc571bfa1a53ef7cedef3a8 not found: ID does not exist" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.352971 4628 scope.go:117] "RemoveContainer" containerID="9045086bafa9d4cc82907939c5d4de90708aff7d1110d2deec96d620223dd8b2" Dec 11 05:33:51 crc kubenswrapper[4628]: E1211 05:33:51.353262 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9045086bafa9d4cc82907939c5d4de90708aff7d1110d2deec96d620223dd8b2\": container with ID starting with 9045086bafa9d4cc82907939c5d4de90708aff7d1110d2deec96d620223dd8b2 not found: ID does not exist" 
containerID="9045086bafa9d4cc82907939c5d4de90708aff7d1110d2deec96d620223dd8b2" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.353361 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9045086bafa9d4cc82907939c5d4de90708aff7d1110d2deec96d620223dd8b2"} err="failed to get container status \"9045086bafa9d4cc82907939c5d4de90708aff7d1110d2deec96d620223dd8b2\": rpc error: code = NotFound desc = could not find container \"9045086bafa9d4cc82907939c5d4de90708aff7d1110d2deec96d620223dd8b2\": container with ID starting with 9045086bafa9d4cc82907939c5d4de90708aff7d1110d2deec96d620223dd8b2 not found: ID does not exist" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.353442 4628 scope.go:117] "RemoveContainer" containerID="b1a8a1d12154f968e3f262e43b08e8162ba97af1c6ae5202cb8a1b5000cb8f96" Dec 11 05:33:51 crc kubenswrapper[4628]: E1211 05:33:51.353933 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1a8a1d12154f968e3f262e43b08e8162ba97af1c6ae5202cb8a1b5000cb8f96\": container with ID starting with b1a8a1d12154f968e3f262e43b08e8162ba97af1c6ae5202cb8a1b5000cb8f96 not found: ID does not exist" containerID="b1a8a1d12154f968e3f262e43b08e8162ba97af1c6ae5202cb8a1b5000cb8f96" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.353982 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1a8a1d12154f968e3f262e43b08e8162ba97af1c6ae5202cb8a1b5000cb8f96"} err="failed to get container status \"b1a8a1d12154f968e3f262e43b08e8162ba97af1c6ae5202cb8a1b5000cb8f96\": rpc error: code = NotFound desc = could not find container \"b1a8a1d12154f968e3f262e43b08e8162ba97af1c6ae5202cb8a1b5000cb8f96\": container with ID starting with b1a8a1d12154f968e3f262e43b08e8162ba97af1c6ae5202cb8a1b5000cb8f96 not found: ID does not exist" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.432504 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-clb84" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.436061 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.448807 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.477048 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:51 crc kubenswrapper[4628]: E1211 05:33:51.477546 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerName="proxy-httpd" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.477581 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerName="proxy-httpd" Dec 11 05:33:51 crc kubenswrapper[4628]: E1211 05:33:51.477593 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerName="ceilometer-central-agent" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.477602 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerName="ceilometer-central-agent" Dec 11 05:33:51 crc kubenswrapper[4628]: E1211 05:33:51.477618 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb00fa79-0866-4f48-b001-3c05352e47aa" containerName="mariadb-database-create" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.477624 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb00fa79-0866-4f48-b001-3c05352e47aa" containerName="mariadb-database-create" Dec 11 05:33:51 crc kubenswrapper[4628]: E1211 05:33:51.477664 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerName="ceilometer-notification-agent" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.477673 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerName="ceilometer-notification-agent" Dec 11 05:33:51 crc kubenswrapper[4628]: E1211 05:33:51.477692 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerName="sg-core" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.477699 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerName="sg-core" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.477964 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerName="ceilometer-central-agent" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.477987 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerName="sg-core" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.477994 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerName="proxy-httpd" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.478002 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" containerName="ceilometer-notification-agent" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.478039 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb00fa79-0866-4f48-b001-3c05352e47aa" 
containerName="mariadb-database-create" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.480208 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.482911 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.483155 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.483436 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.533033 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkm9w\" (UniqueName: \"kubernetes.io/projected/bb00fa79-0866-4f48-b001-3c05352e47aa-kube-api-access-qkm9w\") pod \"bb00fa79-0866-4f48-b001-3c05352e47aa\" (UID: \"bb00fa79-0866-4f48-b001-3c05352e47aa\") " Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.533269 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb00fa79-0866-4f48-b001-3c05352e47aa-operator-scripts\") pod \"bb00fa79-0866-4f48-b001-3c05352e47aa\" (UID: \"bb00fa79-0866-4f48-b001-3c05352e47aa\") " Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.533519 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-config-data\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.533542 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.533588 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-scripts\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.533605 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.533630 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.533663 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-log-httpd\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.533691 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-run-httpd\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.533707 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r52gl\" (UniqueName: \"kubernetes.io/projected/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-kube-api-access-r52gl\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.534367 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb00fa79-0866-4f48-b001-3c05352e47aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb00fa79-0866-4f48-b001-3c05352e47aa" (UID: "bb00fa79-0866-4f48-b001-3c05352e47aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.537461 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb00fa79-0866-4f48-b001-3c05352e47aa-kube-api-access-qkm9w" (OuterVolumeSpecName: "kube-api-access-qkm9w") pod "bb00fa79-0866-4f48-b001-3c05352e47aa" (UID: "bb00fa79-0866-4f48-b001-3c05352e47aa"). InnerVolumeSpecName "kube-api-access-qkm9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.547724 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-7ppf8" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.547934 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.596819 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-024a-account-create-update-b9tx5" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.635491 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxxzl\" (UniqueName: \"kubernetes.io/projected/2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b-kube-api-access-kxxzl\") pod \"2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b\" (UID: \"2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b\") " Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.635572 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q595n\" (UniqueName: \"kubernetes.io/projected/7642b050-2c3e-4a3d-bc5d-b5e007cf316f-kube-api-access-q595n\") pod \"7642b050-2c3e-4a3d-bc5d-b5e007cf316f\" (UID: \"7642b050-2c3e-4a3d-bc5d-b5e007cf316f\") " Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.635606 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b-operator-scripts\") pod \"2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b\" (UID: \"2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b\") " Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.635640 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7642b050-2c3e-4a3d-bc5d-b5e007cf316f-operator-scripts\") pod \"7642b050-2c3e-4a3d-bc5d-b5e007cf316f\" (UID: \"7642b050-2c3e-4a3d-bc5d-b5e007cf316f\") " Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.635906 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-log-httpd\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.635941 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-run-httpd\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.635959 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r52gl\" (UniqueName: \"kubernetes.io/projected/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-kube-api-access-r52gl\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.636039 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-config-data\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.636073 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.636290 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7642b050-2c3e-4a3d-bc5d-b5e007cf316f-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "7642b050-2c3e-4a3d-bc5d-b5e007cf316f" (UID: "7642b050-2c3e-4a3d-bc5d-b5e007cf316f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.636326 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-scripts\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.636378 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.636409 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.636456 4628 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb00fa79-0866-4f48-b001-3c05352e47aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.636467 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkm9w\" (UniqueName: \"kubernetes.io/projected/bb00fa79-0866-4f48-b001-3c05352e47aa-kube-api-access-qkm9w\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.636479 4628 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7642b050-2c3e-4a3d-bc5d-b5e007cf316f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.636626 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-run-httpd\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.636686 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-log-httpd\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.637152 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b" (UID: "2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.639607 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7642b050-2c3e-4a3d-bc5d-b5e007cf316f-kube-api-access-q595n" (OuterVolumeSpecName: "kube-api-access-q595n") pod "7642b050-2c3e-4a3d-bc5d-b5e007cf316f" (UID: "7642b050-2c3e-4a3d-bc5d-b5e007cf316f"). InnerVolumeSpecName "kube-api-access-q595n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.641008 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-scripts\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.642865 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.643735 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.645541 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.645710 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b-kube-api-access-kxxzl" (OuterVolumeSpecName: "kube-api-access-kxxzl") pod "2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b" (UID: "2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b"). InnerVolumeSpecName "kube-api-access-kxxzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.646497 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-config-data\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.656540 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r52gl\" (UniqueName: \"kubernetes.io/projected/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-kube-api-access-r52gl\") pod \"ceilometer-0\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.715951 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.716405 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2696c26e-6fad-43c9-975f-f73149e0466d" containerName="glance-log" containerID="cri-o://f84461630f989c2d9c1f50d65cd98133a40c9bb28140f44835fbe2cea1535224" gracePeriod=30 Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.716538 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2696c26e-6fad-43c9-975f-f73149e0466d" containerName="glance-httpd" containerID="cri-o://518a8f0506ec80d70872849302f680dac00ac6b3fa2af58787e0ed1816b6ad3e" gracePeriod=30 Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.738545 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q595n\" (UniqueName: \"kubernetes.io/projected/7642b050-2c3e-4a3d-bc5d-b5e007cf316f-kube-api-access-q595n\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.738744 4628 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.738830 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxxzl\" (UniqueName: \"kubernetes.io/projected/2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b-kube-api-access-kxxzl\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.820885 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:33:51 crc kubenswrapper[4628]: I1211 05:33:51.917110 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e" path="/var/lib/kubelet/pods/e03b06e1-acfd-4eaa-9fb2-04e27b7ac61e/volumes" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.111458 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-024a-account-create-update-b9tx5" event={"ID":"2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b","Type":"ContainerDied","Data":"9f8acebe5fbc20bafb3d5a4b7ce8d906600069e32cca32278da2dd1792f954d8"} Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.111500 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f8acebe5fbc20bafb3d5a4b7ce8d906600069e32cca32278da2dd1792f954d8" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.111563 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-024a-account-create-update-b9tx5" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.135252 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7ppf8" event={"ID":"7642b050-2c3e-4a3d-bc5d-b5e007cf316f","Type":"ContainerDied","Data":"e6815bc28378c823b251b74a68b853cb233d92ab511bf57a40a3421ecaf9b466"} Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.135296 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6815bc28378c823b251b74a68b853cb233d92ab511bf57a40a3421ecaf9b466" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.135366 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-7ppf8" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.155056 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-clb84" event={"ID":"bb00fa79-0866-4f48-b001-3c05352e47aa","Type":"ContainerDied","Data":"729ad22eaf349f0f1464ab47631d75426a477a4441363885d342781a9d674e18"} Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.155525 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="729ad22eaf349f0f1464ab47631d75426a477a4441363885d342781a9d674e18" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.155648 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-clb84" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.204049 4628 generic.go:334] "Generic (PLEG): container finished" podID="2696c26e-6fad-43c9-975f-f73149e0466d" containerID="f84461630f989c2d9c1f50d65cd98133a40c9bb28140f44835fbe2cea1535224" exitCode=143 Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.204404 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2696c26e-6fad-43c9-975f-f73149e0466d","Type":"ContainerDied","Data":"f84461630f989c2d9c1f50d65cd98133a40c9bb28140f44835fbe2cea1535224"} Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.508744 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.617713 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5519-account-create-update-nfplp" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.659196 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4dc7021-e4d0-4791-9760-4056d74989ad-operator-scripts\") pod \"d4dc7021-e4d0-4791-9760-4056d74989ad\" (UID: \"d4dc7021-e4d0-4791-9760-4056d74989ad\") " Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.660102 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4dc7021-e4d0-4791-9760-4056d74989ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d4dc7021-e4d0-4791-9760-4056d74989ad" (UID: "d4dc7021-e4d0-4791-9760-4056d74989ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.660273 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v54x\" (UniqueName: \"kubernetes.io/projected/d4dc7021-e4d0-4791-9760-4056d74989ad-kube-api-access-5v54x\") pod \"d4dc7021-e4d0-4791-9760-4056d74989ad\" (UID: \"d4dc7021-e4d0-4791-9760-4056d74989ad\") " Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.660679 4628 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4dc7021-e4d0-4791-9760-4056d74989ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.665890 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4dc7021-e4d0-4791-9760-4056d74989ad-kube-api-access-5v54x" (OuterVolumeSpecName: "kube-api-access-5v54x") pod "d4dc7021-e4d0-4791-9760-4056d74989ad" (UID: "d4dc7021-e4d0-4791-9760-4056d74989ad"). InnerVolumeSpecName "kube-api-access-5v54x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.748067 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-csbvp" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.749884 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-24f1-account-create-update-qfv25" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.762719 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v54x\" (UniqueName: \"kubernetes.io/projected/d4dc7021-e4d0-4791-9760-4056d74989ad-kube-api-access-5v54x\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.864479 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e8b0526-5598-4458-8e28-c43557f08cf9-operator-scripts\") pod \"4e8b0526-5598-4458-8e28-c43557f08cf9\" (UID: \"4e8b0526-5598-4458-8e28-c43557f08cf9\") " Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.864559 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86mcs\" (UniqueName: \"kubernetes.io/projected/c3eb6441-7841-43ef-9036-08e2b3d43ed2-kube-api-access-86mcs\") pod \"c3eb6441-7841-43ef-9036-08e2b3d43ed2\" (UID: \"c3eb6441-7841-43ef-9036-08e2b3d43ed2\") " Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.864606 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49wwk\" (UniqueName: \"kubernetes.io/projected/4e8b0526-5598-4458-8e28-c43557f08cf9-kube-api-access-49wwk\") pod \"4e8b0526-5598-4458-8e28-c43557f08cf9\" (UID: \"4e8b0526-5598-4458-8e28-c43557f08cf9\") " Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.864677 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3eb6441-7841-43ef-9036-08e2b3d43ed2-operator-scripts\") pod \"c3eb6441-7841-43ef-9036-08e2b3d43ed2\" (UID: \"c3eb6441-7841-43ef-9036-08e2b3d43ed2\") " Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.865323 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e8b0526-5598-4458-8e28-c43557f08cf9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e8b0526-5598-4458-8e28-c43557f08cf9" (UID: "4e8b0526-5598-4458-8e28-c43557f08cf9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.865370 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3eb6441-7841-43ef-9036-08e2b3d43ed2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c3eb6441-7841-43ef-9036-08e2b3d43ed2" (UID: "c3eb6441-7841-43ef-9036-08e2b3d43ed2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.869041 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e8b0526-5598-4458-8e28-c43557f08cf9-kube-api-access-49wwk" (OuterVolumeSpecName: "kube-api-access-49wwk") pod "4e8b0526-5598-4458-8e28-c43557f08cf9" (UID: "4e8b0526-5598-4458-8e28-c43557f08cf9"). InnerVolumeSpecName "kube-api-access-49wwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.885118 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3eb6441-7841-43ef-9036-08e2b3d43ed2-kube-api-access-86mcs" (OuterVolumeSpecName: "kube-api-access-86mcs") pod "c3eb6441-7841-43ef-9036-08e2b3d43ed2" (UID: "c3eb6441-7841-43ef-9036-08e2b3d43ed2"). 
InnerVolumeSpecName "kube-api-access-86mcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.955668 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.968709 4628 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3eb6441-7841-43ef-9036-08e2b3d43ed2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.968940 4628 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e8b0526-5598-4458-8e28-c43557f08cf9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.969003 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86mcs\" (UniqueName: \"kubernetes.io/projected/c3eb6441-7841-43ef-9036-08e2b3d43ed2-kube-api-access-86mcs\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:52 crc kubenswrapper[4628]: I1211 05:33:52.969058 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49wwk\" (UniqueName: \"kubernetes.io/projected/4e8b0526-5598-4458-8e28-c43557f08cf9-kube-api-access-49wwk\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:53 crc kubenswrapper[4628]: I1211 05:33:53.215812 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-24f1-account-create-update-qfv25" event={"ID":"4e8b0526-5598-4458-8e28-c43557f08cf9","Type":"ContainerDied","Data":"0e2d3083c90ea61312bb0a1cb9bc230aad76ef7b199c7cd2d28def72be9d3c1e"} Dec 11 05:33:53 crc kubenswrapper[4628]: I1211 05:33:53.215867 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e2d3083c90ea61312bb0a1cb9bc230aad76ef7b199c7cd2d28def72be9d3c1e" Dec 11 05:33:53 crc kubenswrapper[4628]: I1211 05:33:53.215933 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-24f1-account-create-update-qfv25" Dec 11 05:33:53 crc kubenswrapper[4628]: I1211 05:33:53.219733 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5519-account-create-update-nfplp" Dec 11 05:33:53 crc kubenswrapper[4628]: I1211 05:33:53.219901 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5519-account-create-update-nfplp" event={"ID":"d4dc7021-e4d0-4791-9760-4056d74989ad","Type":"ContainerDied","Data":"2574cfa98c00c2ad58dc69091fabfa074a45c5824ae5f0e452e6b23866c0ae9b"} Dec 11 05:33:53 crc kubenswrapper[4628]: I1211 05:33:53.220025 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2574cfa98c00c2ad58dc69091fabfa074a45c5824ae5f0e452e6b23866c0ae9b" Dec 11 05:33:53 crc kubenswrapper[4628]: I1211 05:33:53.221412 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1","Type":"ContainerStarted","Data":"305cc49ff1d7f9ecb931faad4a3491f54ae32a99679ddb6f61b491a0baa399e4"} Dec 11 05:33:53 crc kubenswrapper[4628]: I1211 05:33:53.222792 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-csbvp" event={"ID":"c3eb6441-7841-43ef-9036-08e2b3d43ed2","Type":"ContainerDied","Data":"0e23ed70c76c25b763491f5f7f2f7e88655f4dd5380988543c1882e1ae9166a0"} Dec 11 05:33:53 crc kubenswrapper[4628]: I1211 05:33:53.222891 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e23ed70c76c25b763491f5f7f2f7e88655f4dd5380988543c1882e1ae9166a0" Dec 11 05:33:53 crc kubenswrapper[4628]: I1211 05:33:53.223026 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-csbvp" Dec 11 05:33:54 crc kubenswrapper[4628]: I1211 05:33:54.013327 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 05:33:54 crc kubenswrapper[4628]: I1211 05:33:54.014046 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="eb26f327-99d0-4eb1-8c92-d36b17068b04" containerName="glance-httpd" containerID="cri-o://c9777ab6617b13d26700daea25e423aaf1c9264d7da8fef44a14c84ff2fc9ddb" gracePeriod=30 Dec 11 05:33:54 crc kubenswrapper[4628]: I1211 05:33:54.014255 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="eb26f327-99d0-4eb1-8c92-d36b17068b04" containerName="glance-log" containerID="cri-o://d3ae4e692c01a7524ebae194f4f97f400110bc0aacbf136003e7360ed0404a03" gracePeriod=30 Dec 11 05:33:54 crc kubenswrapper[4628]: I1211 05:33:54.232923 4628 generic.go:334] "Generic (PLEG): container finished" podID="eb26f327-99d0-4eb1-8c92-d36b17068b04" containerID="d3ae4e692c01a7524ebae194f4f97f400110bc0aacbf136003e7360ed0404a03" exitCode=143 Dec 11 05:33:54 crc kubenswrapper[4628]: I1211 05:33:54.233025 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eb26f327-99d0-4eb1-8c92-d36b17068b04","Type":"ContainerDied","Data":"d3ae4e692c01a7524ebae194f4f97f400110bc0aacbf136003e7360ed0404a03"} Dec 11 05:33:56 crc kubenswrapper[4628]: I1211 05:33:56.904585 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.024575 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-combined-ca-bundle\") pod \"2696c26e-6fad-43c9-975f-f73149e0466d\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.024694 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-config-data\") pod \"2696c26e-6fad-43c9-975f-f73149e0466d\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.024764 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td4nq\" (UniqueName: \"kubernetes.io/projected/2696c26e-6fad-43c9-975f-f73149e0466d-kube-api-access-td4nq\") pod \"2696c26e-6fad-43c9-975f-f73149e0466d\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.024804 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-scripts\") pod \"2696c26e-6fad-43c9-975f-f73149e0466d\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.024870 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"2696c26e-6fad-43c9-975f-f73149e0466d\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.024901 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2696c26e-6fad-43c9-975f-f73149e0466d-logs\") pod \"2696c26e-6fad-43c9-975f-f73149e0466d\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.024942 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-public-tls-certs\") pod \"2696c26e-6fad-43c9-975f-f73149e0466d\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.024977 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2696c26e-6fad-43c9-975f-f73149e0466d-httpd-run\") pod \"2696c26e-6fad-43c9-975f-f73149e0466d\" (UID: \"2696c26e-6fad-43c9-975f-f73149e0466d\") " Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.025979 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2696c26e-6fad-43c9-975f-f73149e0466d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2696c26e-6fad-43c9-975f-f73149e0466d" (UID: "2696c26e-6fad-43c9-975f-f73149e0466d"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.026545 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2696c26e-6fad-43c9-975f-f73149e0466d-logs" (OuterVolumeSpecName: "logs") pod "2696c26e-6fad-43c9-975f-f73149e0466d" (UID: "2696c26e-6fad-43c9-975f-f73149e0466d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.029834 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-scripts" (OuterVolumeSpecName: "scripts") pod "2696c26e-6fad-43c9-975f-f73149e0466d" (UID: "2696c26e-6fad-43c9-975f-f73149e0466d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.047047 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "2696c26e-6fad-43c9-975f-f73149e0466d" (UID: "2696c26e-6fad-43c9-975f-f73149e0466d"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.050010 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2696c26e-6fad-43c9-975f-f73149e0466d-kube-api-access-td4nq" (OuterVolumeSpecName: "kube-api-access-td4nq") pod "2696c26e-6fad-43c9-975f-f73149e0466d" (UID: "2696c26e-6fad-43c9-975f-f73149e0466d"). InnerVolumeSpecName "kube-api-access-td4nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.085233 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2696c26e-6fad-43c9-975f-f73149e0466d" (UID: "2696c26e-6fad-43c9-975f-f73149e0466d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.100957 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2696c26e-6fad-43c9-975f-f73149e0466d" (UID: "2696c26e-6fad-43c9-975f-f73149e0466d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.126987 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.127515 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-td4nq\" (UniqueName: \"kubernetes.io/projected/2696c26e-6fad-43c9-975f-f73149e0466d-kube-api-access-td4nq\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.127606 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.127699 4628 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.127783 4628 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2696c26e-6fad-43c9-975f-f73149e0466d-logs\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.127941 4628 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.128005 4628 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2696c26e-6fad-43c9-975f-f73149e0466d-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.142983 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-config-data" (OuterVolumeSpecName: "config-data") pod "2696c26e-6fad-43c9-975f-f73149e0466d" (UID: "2696c26e-6fad-43c9-975f-f73149e0466d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.151613 4628 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.230007 4628 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.230038 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2696c26e-6fad-43c9-975f-f73149e0466d-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.484592 4628 generic.go:334] "Generic (PLEG): container finished" podID="2696c26e-6fad-43c9-975f-f73149e0466d" containerID="518a8f0506ec80d70872849302f680dac00ac6b3fa2af58787e0ed1816b6ad3e" exitCode=0 Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.484672 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2696c26e-6fad-43c9-975f-f73149e0466d","Type":"ContainerDied","Data":"518a8f0506ec80d70872849302f680dac00ac6b3fa2af58787e0ed1816b6ad3e"} Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.484708 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2696c26e-6fad-43c9-975f-f73149e0466d","Type":"ContainerDied","Data":"dcec039349918899a7f6d8ee781f4d31e990784f3e5c12aab265a55e47971219"} Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.484726 4628 scope.go:117] "RemoveContainer" containerID="518a8f0506ec80d70872849302f680dac00ac6b3fa2af58787e0ed1816b6ad3e" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.484940 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.491227 4628 generic.go:334] "Generic (PLEG): container finished" podID="eb26f327-99d0-4eb1-8c92-d36b17068b04" containerID="c9777ab6617b13d26700daea25e423aaf1c9264d7da8fef44a14c84ff2fc9ddb" exitCode=0 Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.491356 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eb26f327-99d0-4eb1-8c92-d36b17068b04","Type":"ContainerDied","Data":"c9777ab6617b13d26700daea25e423aaf1c9264d7da8fef44a14c84ff2fc9ddb"} Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.503395 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1","Type":"ContainerStarted","Data":"3d5c2b2171b0513e18b2e2572d848bed9a4ce3b19c9d21d1a01f5c8cb81d348b"} Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.503440 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1","Type":"ContainerStarted","Data":"ad36e59750b538a175bb80c3e4660791cd75b15c0c986387f5fcc396f4cba782"} Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.540463 4628 scope.go:117] "RemoveContainer" containerID="f84461630f989c2d9c1f50d65cd98133a40c9bb28140f44835fbe2cea1535224" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.581927 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.605746 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.628056 4628 scope.go:117] "RemoveContainer" containerID="518a8f0506ec80d70872849302f680dac00ac6b3fa2af58787e0ed1816b6ad3e" Dec 11 05:33:57 crc kubenswrapper[4628]: E1211 05:33:57.638051 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"518a8f0506ec80d70872849302f680dac00ac6b3fa2af58787e0ed1816b6ad3e\": container with ID starting with 518a8f0506ec80d70872849302f680dac00ac6b3fa2af58787e0ed1816b6ad3e not found: ID does not exist" containerID="518a8f0506ec80d70872849302f680dac00ac6b3fa2af58787e0ed1816b6ad3e" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.638104 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"518a8f0506ec80d70872849302f680dac00ac6b3fa2af58787e0ed1816b6ad3e"} err="failed to get container status \"518a8f0506ec80d70872849302f680dac00ac6b3fa2af58787e0ed1816b6ad3e\": rpc error: code = NotFound desc = could not find container \"518a8f0506ec80d70872849302f680dac00ac6b3fa2af58787e0ed1816b6ad3e\": container with ID starting with 518a8f0506ec80d70872849302f680dac00ac6b3fa2af58787e0ed1816b6ad3e not found: ID does not exist" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.638130 4628 scope.go:117] "RemoveContainer" containerID="f84461630f989c2d9c1f50d65cd98133a40c9bb28140f44835fbe2cea1535224" Dec 11 05:33:57 crc kubenswrapper[4628]: E1211 05:33:57.639620 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f84461630f989c2d9c1f50d65cd98133a40c9bb28140f44835fbe2cea1535224\": container with ID starting with f84461630f989c2d9c1f50d65cd98133a40c9bb28140f44835fbe2cea1535224 not found: ID does not exist" 
containerID="f84461630f989c2d9c1f50d65cd98133a40c9bb28140f44835fbe2cea1535224" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.639643 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f84461630f989c2d9c1f50d65cd98133a40c9bb28140f44835fbe2cea1535224"} err="failed to get container status \"f84461630f989c2d9c1f50d65cd98133a40c9bb28140f44835fbe2cea1535224\": rpc error: code = NotFound desc = could not find container \"f84461630f989c2d9c1f50d65cd98133a40c9bb28140f44835fbe2cea1535224\": container with ID starting with f84461630f989c2d9c1f50d65cd98133a40c9bb28140f44835fbe2cea1535224 not found: ID does not exist" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.643943 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 05:33:57 crc kubenswrapper[4628]: E1211 05:33:57.644377 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4dc7021-e4d0-4791-9760-4056d74989ad" containerName="mariadb-account-create-update" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.644395 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4dc7021-e4d0-4791-9760-4056d74989ad" containerName="mariadb-account-create-update" Dec 11 05:33:57 crc kubenswrapper[4628]: E1211 05:33:57.644410 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7642b050-2c3e-4a3d-bc5d-b5e007cf316f" containerName="mariadb-database-create" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.644417 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="7642b050-2c3e-4a3d-bc5d-b5e007cf316f" containerName="mariadb-database-create" Dec 11 05:33:57 crc kubenswrapper[4628]: E1211 05:33:57.644426 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2696c26e-6fad-43c9-975f-f73149e0466d" containerName="glance-httpd" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.644432 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2696c26e-6fad-43c9-975f-f73149e0466d" containerName="glance-httpd" Dec 11 05:33:57 crc kubenswrapper[4628]: E1211 05:33:57.644447 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b" containerName="mariadb-account-create-update" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.644453 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b" containerName="mariadb-account-create-update" Dec 11 05:33:57 crc kubenswrapper[4628]: E1211 05:33:57.644463 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3eb6441-7841-43ef-9036-08e2b3d43ed2" containerName="mariadb-database-create" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.644469 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3eb6441-7841-43ef-9036-08e2b3d43ed2" containerName="mariadb-database-create" Dec 11 05:33:57 crc kubenswrapper[4628]: E1211 05:33:57.644481 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e8b0526-5598-4458-8e28-c43557f08cf9" containerName="mariadb-account-create-update" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.644487 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e8b0526-5598-4458-8e28-c43557f08cf9" containerName="mariadb-account-create-update" Dec 11 05:33:57 crc kubenswrapper[4628]: E1211 05:33:57.644503 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2696c26e-6fad-43c9-975f-f73149e0466d" containerName="glance-log" Dec 11 05:33:57 crc 
kubenswrapper[4628]: I1211 05:33:57.644508 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2696c26e-6fad-43c9-975f-f73149e0466d" containerName="glance-log" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.644680 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4dc7021-e4d0-4791-9760-4056d74989ad" containerName="mariadb-account-create-update" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.644694 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="2696c26e-6fad-43c9-975f-f73149e0466d" containerName="glance-httpd" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.644710 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="7642b050-2c3e-4a3d-bc5d-b5e007cf316f" containerName="mariadb-database-create" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.644722 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3eb6441-7841-43ef-9036-08e2b3d43ed2" containerName="mariadb-database-create" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.644730 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e8b0526-5598-4458-8e28-c43557f08cf9" containerName="mariadb-account-create-update" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.644745 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="2696c26e-6fad-43c9-975f-f73149e0466d" containerName="glance-log" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.644755 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b" containerName="mariadb-account-create-update" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.645852 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.654416 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.662062 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.662258 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.745595 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/634cddf9-405e-42ee-a106-3c99b8d921d1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.745648 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/634cddf9-405e-42ee-a106-3c99b8d921d1-scripts\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.745685 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/634cddf9-405e-42ee-a106-3c99b8d921d1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 
11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.745715 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/634cddf9-405e-42ee-a106-3c99b8d921d1-config-data\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.745741 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mprd\" (UniqueName: \"kubernetes.io/projected/634cddf9-405e-42ee-a106-3c99b8d921d1-kube-api-access-6mprd\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.745763 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.745786 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/634cddf9-405e-42ee-a106-3c99b8d921d1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.745822 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/634cddf9-405e-42ee-a106-3c99b8d921d1-logs\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.847319 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/634cddf9-405e-42ee-a106-3c99b8d921d1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.847364 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/634cddf9-405e-42ee-a106-3c99b8d921d1-scripts\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.847403 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/634cddf9-405e-42ee-a106-3c99b8d921d1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.847437 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/634cddf9-405e-42ee-a106-3c99b8d921d1-config-data\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 
05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.847465 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mprd\" (UniqueName: \"kubernetes.io/projected/634cddf9-405e-42ee-a106-3c99b8d921d1-kube-api-access-6mprd\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.847487 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.847507 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/634cddf9-405e-42ee-a106-3c99b8d921d1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.847550 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/634cddf9-405e-42ee-a106-3c99b8d921d1-logs\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.848150 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/634cddf9-405e-42ee-a106-3c99b8d921d1-logs\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.848220 4628 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.848650 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/634cddf9-405e-42ee-a106-3c99b8d921d1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.850544 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.850741 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.856416 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/634cddf9-405e-42ee-a106-3c99b8d921d1-scripts\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.860468 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/634cddf9-405e-42ee-a106-3c99b8d921d1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.869038 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/634cddf9-405e-42ee-a106-3c99b8d921d1-config-data\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.882803 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/634cddf9-405e-42ee-a106-3c99b8d921d1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.893115 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mprd\" (UniqueName: \"kubernetes.io/projected/634cddf9-405e-42ee-a106-3c99b8d921d1-kube-api-access-6mprd\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.928306 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.939447 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"634cddf9-405e-42ee-a106-3c99b8d921d1\") " pod="openstack/glance-default-external-api-0" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.949237 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2696c26e-6fad-43c9-975f-f73149e0466d" path="/var/lib/kubelet/pods/2696c26e-6fad-43c9-975f-f73149e0466d/volumes" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.950810 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb26f327-99d0-4eb1-8c92-d36b17068b04-logs\") pod \"eb26f327-99d0-4eb1-8c92-d36b17068b04\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.951114 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-internal-tls-certs\") pod \"eb26f327-99d0-4eb1-8c92-d36b17068b04\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.951136 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"eb26f327-99d0-4eb1-8c92-d36b17068b04\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.951170 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-combined-ca-bundle\") pod \"eb26f327-99d0-4eb1-8c92-d36b17068b04\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " 
Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.951233 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb26f327-99d0-4eb1-8c92-d36b17068b04-httpd-run\") pod \"eb26f327-99d0-4eb1-8c92-d36b17068b04\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.951253 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vhrr\" (UniqueName: \"kubernetes.io/projected/eb26f327-99d0-4eb1-8c92-d36b17068b04-kube-api-access-5vhrr\") pod \"eb26f327-99d0-4eb1-8c92-d36b17068b04\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.951268 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-scripts\") pod \"eb26f327-99d0-4eb1-8c92-d36b17068b04\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.951340 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-config-data\") pod \"eb26f327-99d0-4eb1-8c92-d36b17068b04\" (UID: \"eb26f327-99d0-4eb1-8c92-d36b17068b04\") " Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.955155 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb26f327-99d0-4eb1-8c92-d36b17068b04-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eb26f327-99d0-4eb1-8c92-d36b17068b04" (UID: "eb26f327-99d0-4eb1-8c92-d36b17068b04"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.957243 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb26f327-99d0-4eb1-8c92-d36b17068b04-logs" (OuterVolumeSpecName: "logs") pod "eb26f327-99d0-4eb1-8c92-d36b17068b04" (UID: "eb26f327-99d0-4eb1-8c92-d36b17068b04"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.984291 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb26f327-99d0-4eb1-8c92-d36b17068b04-kube-api-access-5vhrr" (OuterVolumeSpecName: "kube-api-access-5vhrr") pod "eb26f327-99d0-4eb1-8c92-d36b17068b04" (UID: "eb26f327-99d0-4eb1-8c92-d36b17068b04"). InnerVolumeSpecName "kube-api-access-5vhrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:33:57 crc kubenswrapper[4628]: I1211 05:33:57.984516 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "eb26f327-99d0-4eb1-8c92-d36b17068b04" (UID: "eb26f327-99d0-4eb1-8c92-d36b17068b04"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.006311 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-scripts" (OuterVolumeSpecName: "scripts") pod "eb26f327-99d0-4eb1-8c92-d36b17068b04" (UID: "eb26f327-99d0-4eb1-8c92-d36b17068b04"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.053130 4628 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb26f327-99d0-4eb1-8c92-d36b17068b04-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.053166 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vhrr\" (UniqueName: \"kubernetes.io/projected/eb26f327-99d0-4eb1-8c92-d36b17068b04-kube-api-access-5vhrr\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.053179 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.053187 4628 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb26f327-99d0-4eb1-8c92-d36b17068b04-logs\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.053217 4628 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.132015 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eb26f327-99d0-4eb1-8c92-d36b17068b04" (UID: "eb26f327-99d0-4eb1-8c92-d36b17068b04"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.134546 4628 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.147038 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-config-data" (OuterVolumeSpecName: "config-data") pod "eb26f327-99d0-4eb1-8c92-d36b17068b04" (UID: "eb26f327-99d0-4eb1-8c92-d36b17068b04"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.155225 4628 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.155255 4628 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.155267 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.159161 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb26f327-99d0-4eb1-8c92-d36b17068b04" (UID: "eb26f327-99d0-4eb1-8c92-d36b17068b04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.185547 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.237290 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k6kds"] Dec 11 05:33:58 crc kubenswrapper[4628]: E1211 05:33:58.237667 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb26f327-99d0-4eb1-8c92-d36b17068b04" containerName="glance-log" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.237684 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb26f327-99d0-4eb1-8c92-d36b17068b04" containerName="glance-log" Dec 11 05:33:58 crc kubenswrapper[4628]: E1211 05:33:58.237700 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb26f327-99d0-4eb1-8c92-d36b17068b04" containerName="glance-httpd" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.237707 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb26f327-99d0-4eb1-8c92-d36b17068b04" containerName="glance-httpd" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.237898 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb26f327-99d0-4eb1-8c92-d36b17068b04" containerName="glance-httpd" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.237914 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb26f327-99d0-4eb1-8c92-d36b17068b04" containerName="glance-log" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.238484 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k6kds" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.241468 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-n2vkj" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.241722 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.242003 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.251881 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k6kds"] Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.256250 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58137696-e08d-4d26-ba22-d0fdb614485b-scripts\") pod \"nova-cell0-conductor-db-sync-k6kds\" (UID: \"58137696-e08d-4d26-ba22-d0fdb614485b\") " pod="openstack/nova-cell0-conductor-db-sync-k6kds" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.256296 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58137696-e08d-4d26-ba22-d0fdb614485b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k6kds\" (UID: \"58137696-e08d-4d26-ba22-d0fdb614485b\") " pod="openstack/nova-cell0-conductor-db-sync-k6kds" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.256329 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58137696-e08d-4d26-ba22-d0fdb614485b-config-data\") pod \"nova-cell0-conductor-db-sync-k6kds\" (UID: \"58137696-e08d-4d26-ba22-d0fdb614485b\") " pod="openstack/nova-cell0-conductor-db-sync-k6kds" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.256390 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9g62\" (UniqueName: \"kubernetes.io/projected/58137696-e08d-4d26-ba22-d0fdb614485b-kube-api-access-d9g62\") pod \"nova-cell0-conductor-db-sync-k6kds\" (UID: \"58137696-e08d-4d26-ba22-d0fdb614485b\") " pod="openstack/nova-cell0-conductor-db-sync-k6kds" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.256445 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb26f327-99d0-4eb1-8c92-d36b17068b04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.357724 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58137696-e08d-4d26-ba22-d0fdb614485b-scripts\") pod \"nova-cell0-conductor-db-sync-k6kds\" (UID: \"58137696-e08d-4d26-ba22-d0fdb614485b\") " pod="openstack/nova-cell0-conductor-db-sync-k6kds" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.357976 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58137696-e08d-4d26-ba22-d0fdb614485b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k6kds\" (UID: \"58137696-e08d-4d26-ba22-d0fdb614485b\") " pod="openstack/nova-cell0-conductor-db-sync-k6kds" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 
05:33:58.358015 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58137696-e08d-4d26-ba22-d0fdb614485b-config-data\") pod \"nova-cell0-conductor-db-sync-k6kds\" (UID: \"58137696-e08d-4d26-ba22-d0fdb614485b\") " pod="openstack/nova-cell0-conductor-db-sync-k6kds" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.358058 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9g62\" (UniqueName: \"kubernetes.io/projected/58137696-e08d-4d26-ba22-d0fdb614485b-kube-api-access-d9g62\") pod \"nova-cell0-conductor-db-sync-k6kds\" (UID: \"58137696-e08d-4d26-ba22-d0fdb614485b\") " pod="openstack/nova-cell0-conductor-db-sync-k6kds" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.362814 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58137696-e08d-4d26-ba22-d0fdb614485b-scripts\") pod \"nova-cell0-conductor-db-sync-k6kds\" (UID: \"58137696-e08d-4d26-ba22-d0fdb614485b\") " pod="openstack/nova-cell0-conductor-db-sync-k6kds" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.374513 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58137696-e08d-4d26-ba22-d0fdb614485b-config-data\") pod \"nova-cell0-conductor-db-sync-k6kds\" (UID: \"58137696-e08d-4d26-ba22-d0fdb614485b\") " pod="openstack/nova-cell0-conductor-db-sync-k6kds" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.375816 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58137696-e08d-4d26-ba22-d0fdb614485b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k6kds\" (UID: \"58137696-e08d-4d26-ba22-d0fdb614485b\") " pod="openstack/nova-cell0-conductor-db-sync-k6kds" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.438576 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9g62\" (UniqueName: \"kubernetes.io/projected/58137696-e08d-4d26-ba22-d0fdb614485b-kube-api-access-d9g62\") pod \"nova-cell0-conductor-db-sync-k6kds\" (UID: \"58137696-e08d-4d26-ba22-d0fdb614485b\") " pod="openstack/nova-cell0-conductor-db-sync-k6kds" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.560091 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1","Type":"ContainerStarted","Data":"23f3095de7368e8f8143ec4e684f14d98b43eb8c0342e7715cd67a98bfe71565"} Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.562806 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k6kds" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.587311 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eb26f327-99d0-4eb1-8c92-d36b17068b04","Type":"ContainerDied","Data":"948d42f7ce8fd2230a104db328e13fee4e18b517395b7097371dd1f6a2e67d19"} Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.587360 4628 scope.go:117] "RemoveContainer" containerID="c9777ab6617b13d26700daea25e423aaf1c9264d7da8fef44a14c84ff2fc9ddb" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.587491 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.692289 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.693275 4628 scope.go:117] "RemoveContainer" containerID="d3ae4e692c01a7524ebae194f4f97f400110bc0aacbf136003e7360ed0404a03" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.710823 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.719255 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.720993 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.723234 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.723484 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.728286 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.881968 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83368b19-5867-444e-a7ea-55683f0e6b26-scripts\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.882076 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfgb8\" (UniqueName: \"kubernetes.io/projected/83368b19-5867-444e-a7ea-55683f0e6b26-kube-api-access-nfgb8\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.882150 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83368b19-5867-444e-a7ea-55683f0e6b26-config-data\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.882167 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83368b19-5867-444e-a7ea-55683f0e6b26-logs\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.882254 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83368b19-5867-444e-a7ea-55683f0e6b26-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.882330 4628 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83368b19-5867-444e-a7ea-55683f0e6b26-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.882356 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/83368b19-5867-444e-a7ea-55683f0e6b26-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.882386 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.902961 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.983918 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfgb8\" (UniqueName: \"kubernetes.io/projected/83368b19-5867-444e-a7ea-55683f0e6b26-kube-api-access-nfgb8\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.984010 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83368b19-5867-444e-a7ea-55683f0e6b26-config-data\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.984039 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83368b19-5867-444e-a7ea-55683f0e6b26-logs\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.984107 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83368b19-5867-444e-a7ea-55683f0e6b26-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.984178 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83368b19-5867-444e-a7ea-55683f0e6b26-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.984205 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/83368b19-5867-444e-a7ea-55683f0e6b26-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.984237 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.984277 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83368b19-5867-444e-a7ea-55683f0e6b26-scripts\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.985983 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/83368b19-5867-444e-a7ea-55683f0e6b26-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.987215 4628 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.993053 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83368b19-5867-444e-a7ea-55683f0e6b26-logs\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.994223 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83368b19-5867-444e-a7ea-55683f0e6b26-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.996412 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83368b19-5867-444e-a7ea-55683f0e6b26-config-data\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.998226 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83368b19-5867-444e-a7ea-55683f0e6b26-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:58 crc kubenswrapper[4628]: I1211 05:33:58.998658 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83368b19-5867-444e-a7ea-55683f0e6b26-scripts\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:59 crc kubenswrapper[4628]: I1211 05:33:59.015392 
4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfgb8\" (UniqueName: \"kubernetes.io/projected/83368b19-5867-444e-a7ea-55683f0e6b26-kube-api-access-nfgb8\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:59 crc kubenswrapper[4628]: I1211 05:33:59.044130 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"83368b19-5867-444e-a7ea-55683f0e6b26\") " pod="openstack/glance-default-internal-api-0" Dec 11 05:33:59 crc kubenswrapper[4628]: I1211 05:33:59.052555 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 11 05:33:59 crc kubenswrapper[4628]: I1211 05:33:59.186172 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k6kds"] Dec 11 05:33:59 crc kubenswrapper[4628]: I1211 05:33:59.606333 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k6kds" event={"ID":"58137696-e08d-4d26-ba22-d0fdb614485b","Type":"ContainerStarted","Data":"2cacb951f9adbd82b3e7d86b674a4d3649d712a0f6eec2ca73537bdbf8cd32ea"} Dec 11 05:33:59 crc kubenswrapper[4628]: I1211 05:33:59.623144 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"634cddf9-405e-42ee-a106-3c99b8d921d1","Type":"ContainerStarted","Data":"3f46b08be1647cbe50dfde22e421085296ed4aedbeed3f41bf15dc6bb5baa31e"} Dec 11 05:33:59 crc kubenswrapper[4628]: I1211 05:33:59.626282 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1","Type":"ContainerStarted","Data":"2efdc68d971110a49faf79db40108edbea4ed1e6180d6868835cd83223667511"} Dec 11 05:33:59 crc kubenswrapper[4628]: I1211 05:33:59.626533 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerName="ceilometer-central-agent" containerID="cri-o://3d5c2b2171b0513e18b2e2572d848bed9a4ce3b19c9d21d1a01f5c8cb81d348b" gracePeriod=30 Dec 11 05:33:59 crc kubenswrapper[4628]: I1211 05:33:59.626917 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 11 05:33:59 crc kubenswrapper[4628]: I1211 05:33:59.627324 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerName="proxy-httpd" containerID="cri-o://2efdc68d971110a49faf79db40108edbea4ed1e6180d6868835cd83223667511" gracePeriod=30 Dec 11 05:33:59 crc kubenswrapper[4628]: I1211 05:33:59.627406 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerName="sg-core" containerID="cri-o://23f3095de7368e8f8143ec4e684f14d98b43eb8c0342e7715cd67a98bfe71565" gracePeriod=30 Dec 11 05:33:59 crc kubenswrapper[4628]: I1211 05:33:59.627477 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerName="ceilometer-notification-agent" containerID="cri-o://ad36e59750b538a175bb80c3e4660791cd75b15c0c986387f5fcc396f4cba782" gracePeriod=30 Dec 11 
05:33:59 crc kubenswrapper[4628]: I1211 05:33:59.669435 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.109674321 podStartE2EDuration="8.669416801s" podCreationTimestamp="2025-12-11 05:33:51 +0000 UTC" firstStartedPulling="2025-12-11 05:33:52.513633616 +0000 UTC m=+1134.930980314" lastFinishedPulling="2025-12-11 05:33:59.073376106 +0000 UTC m=+1141.490722794" observedRunningTime="2025-12-11 05:33:59.652786639 +0000 UTC m=+1142.070133337" watchObservedRunningTime="2025-12-11 05:33:59.669416801 +0000 UTC m=+1142.086763499" Dec 11 05:33:59 crc kubenswrapper[4628]: I1211 05:33:59.679053 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 11 05:33:59 crc kubenswrapper[4628]: W1211 05:33:59.698408 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83368b19_5867_444e_a7ea_55683f0e6b26.slice/crio-27b822715ea089c7cb6fa95559068ab85ce069daf2af89b03d6e2e65d2af6b36 WatchSource:0}: Error finding container 27b822715ea089c7cb6fa95559068ab85ce069daf2af89b03d6e2e65d2af6b36: Status 404 returned error can't find the container with id 27b822715ea089c7cb6fa95559068ab85ce069daf2af89b03d6e2e65d2af6b36 Dec 11 05:33:59 crc kubenswrapper[4628]: I1211 05:33:59.902806 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb26f327-99d0-4eb1-8c92-d36b17068b04" path="/var/lib/kubelet/pods/eb26f327-99d0-4eb1-8c92-d36b17068b04/volumes" Dec 11 05:34:00 crc kubenswrapper[4628]: I1211 05:34:00.678571 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"634cddf9-405e-42ee-a106-3c99b8d921d1","Type":"ContainerStarted","Data":"b0a605897ecda3a9cd2a8ef7d2faaed687f18074d6ae0fe6526abcfa3fdd0378"} Dec 11 05:34:00 crc kubenswrapper[4628]: I1211 05:34:00.699289 4628 generic.go:334] "Generic (PLEG): container finished" podID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerID="2efdc68d971110a49faf79db40108edbea4ed1e6180d6868835cd83223667511" exitCode=0 Dec 11 05:34:00 crc kubenswrapper[4628]: I1211 05:34:00.699320 4628 generic.go:334] "Generic (PLEG): container finished" podID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerID="23f3095de7368e8f8143ec4e684f14d98b43eb8c0342e7715cd67a98bfe71565" exitCode=2 Dec 11 05:34:00 crc kubenswrapper[4628]: I1211 05:34:00.699327 4628 generic.go:334] "Generic (PLEG): container finished" podID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerID="ad36e59750b538a175bb80c3e4660791cd75b15c0c986387f5fcc396f4cba782" exitCode=0 Dec 11 05:34:00 crc kubenswrapper[4628]: I1211 05:34:00.699385 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1","Type":"ContainerDied","Data":"2efdc68d971110a49faf79db40108edbea4ed1e6180d6868835cd83223667511"} Dec 11 05:34:00 crc kubenswrapper[4628]: I1211 05:34:00.699423 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1","Type":"ContainerDied","Data":"23f3095de7368e8f8143ec4e684f14d98b43eb8c0342e7715cd67a98bfe71565"} Dec 11 05:34:00 crc kubenswrapper[4628]: I1211 05:34:00.699432 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1","Type":"ContainerDied","Data":"ad36e59750b538a175bb80c3e4660791cd75b15c0c986387f5fcc396f4cba782"} Dec 11 05:34:00 crc 
kubenswrapper[4628]: I1211 05:34:00.708034 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"83368b19-5867-444e-a7ea-55683f0e6b26","Type":"ContainerStarted","Data":"10b1b7a187b6a409508c96038567b438ee72dd9f9f8566d5da5263e576927d33"} Dec 11 05:34:00 crc kubenswrapper[4628]: I1211 05:34:00.708091 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"83368b19-5867-444e-a7ea-55683f0e6b26","Type":"ContainerStarted","Data":"27b822715ea089c7cb6fa95559068ab85ce069daf2af89b03d6e2e65d2af6b36"} Dec 11 05:34:01 crc kubenswrapper[4628]: I1211 05:34:01.426643 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:34:01 crc kubenswrapper[4628]: I1211 05:34:01.426911 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:34:01 crc kubenswrapper[4628]: I1211 05:34:01.733722 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"634cddf9-405e-42ee-a106-3c99b8d921d1","Type":"ContainerStarted","Data":"45ef8b622ac98f422ad1504bb6835dbd30d33a41c338a09568c578805c4160d6"} Dec 11 05:34:01 crc kubenswrapper[4628]: I1211 05:34:01.762584 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.762565317 podStartE2EDuration="4.762565317s" podCreationTimestamp="2025-12-11 05:33:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:34:01.755552046 +0000 UTC m=+1144.172898744" watchObservedRunningTime="2025-12-11 05:34:01.762565317 +0000 UTC m=+1144.179912015" Dec 11 05:34:02 crc kubenswrapper[4628]: I1211 05:34:02.746340 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"83368b19-5867-444e-a7ea-55683f0e6b26","Type":"ContainerStarted","Data":"91536e225d66705725282f414485183b5d96ff576d30f8e1d10b9f6f1bcd3d99"} Dec 11 05:34:02 crc kubenswrapper[4628]: I1211 05:34:02.766500 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.7664824 podStartE2EDuration="4.7664824s" podCreationTimestamp="2025-12-11 05:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:34:02.762244425 +0000 UTC m=+1145.179591133" watchObservedRunningTime="2025-12-11 05:34:02.7664824 +0000 UTC m=+1145.183829098" Dec 11 05:34:08 crc kubenswrapper[4628]: I1211 05:34:08.186203 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 11 05:34:08 crc kubenswrapper[4628]: I1211 05:34:08.186782 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 11 05:34:08 crc kubenswrapper[4628]: I1211 
05:34:08.229027 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 11 05:34:08 crc kubenswrapper[4628]: I1211 05:34:08.232154 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 11 05:34:08 crc kubenswrapper[4628]: I1211 05:34:08.826290 4628 generic.go:334] "Generic (PLEG): container finished" podID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerID="3d5c2b2171b0513e18b2e2572d848bed9a4ce3b19c9d21d1a01f5c8cb81d348b" exitCode=0 Dec 11 05:34:08 crc kubenswrapper[4628]: I1211 05:34:08.827304 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1","Type":"ContainerDied","Data":"3d5c2b2171b0513e18b2e2572d848bed9a4ce3b19c9d21d1a01f5c8cb81d348b"} Dec 11 05:34:08 crc kubenswrapper[4628]: I1211 05:34:08.827675 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 11 05:34:08 crc kubenswrapper[4628]: I1211 05:34:08.827717 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.053783 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.054047 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.083123 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.094063 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.430691 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.524718 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-run-httpd\") pod \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.524785 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-config-data\") pod \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.524834 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-combined-ca-bundle\") pod \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.524971 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-scripts\") pod \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.525117 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-sg-core-conf-yaml\") pod \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.525213 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-ceilometer-tls-certs\") pod \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.525232 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" (UID: "ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.525271 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r52gl\" (UniqueName: \"kubernetes.io/projected/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-kube-api-access-r52gl\") pod \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.525310 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-log-httpd\") pod \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.525701 4628 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.526223 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" (UID: "ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.530834 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-scripts" (OuterVolumeSpecName: "scripts") pod "ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" (UID: "ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.532051 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-kube-api-access-r52gl" (OuterVolumeSpecName: "kube-api-access-r52gl") pod "ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" (UID: "ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1"). InnerVolumeSpecName "kube-api-access-r52gl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.569344 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" (UID: "ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.594707 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" (UID: "ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.607371 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" (UID: "ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.626537 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-config-data" (OuterVolumeSpecName: "config-data") pod "ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" (UID: "ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.626617 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-config-data\") pod \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\" (UID: \"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1\") " Dec 11 05:34:09 crc kubenswrapper[4628]: W1211 05:34:09.626738 4628 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1/volumes/kubernetes.io~secret/config-data Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.626752 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-config-data" (OuterVolumeSpecName: "config-data") pod "ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" (UID: "ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.627180 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.627200 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.627211 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.627220 4628 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.627228 4628 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.627237 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r52gl\" (UniqueName: \"kubernetes.io/projected/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-kube-api-access-r52gl\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.627245 4628 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.836773 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k6kds" event={"ID":"58137696-e08d-4d26-ba22-d0fdb614485b","Type":"ContainerStarted","Data":"329660e4821c297605c9d21981de80333adb9b65fd18540e9b3eb13602643569"} Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.841210 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1","Type":"ContainerDied","Data":"305cc49ff1d7f9ecb931faad4a3491f54ae32a99679ddb6f61b491a0baa399e4"} Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.841328 4628 scope.go:117] "RemoveContainer" containerID="2efdc68d971110a49faf79db40108edbea4ed1e6180d6868835cd83223667511" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.841479 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.846534 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.846564 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.865461 4628 scope.go:117] "RemoveContainer" containerID="23f3095de7368e8f8143ec4e684f14d98b43eb8c0342e7715cd67a98bfe71565" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.867305 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-k6kds" podStartSLOduration=1.8113446789999998 podStartE2EDuration="11.86728946s" podCreationTimestamp="2025-12-11 05:33:58 +0000 UTC" firstStartedPulling="2025-12-11 05:33:59.21147064 +0000 UTC m=+1141.628817338" lastFinishedPulling="2025-12-11 05:34:09.267415421 +0000 UTC m=+1151.684762119" observedRunningTime="2025-12-11 05:34:09.856960449 +0000 UTC m=+1152.274307147" watchObservedRunningTime="2025-12-11 05:34:09.86728946 +0000 UTC m=+1152.284636158" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.911637 4628 scope.go:117] "RemoveContainer" containerID="ad36e59750b538a175bb80c3e4660791cd75b15c0c986387f5fcc396f4cba782" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.914944 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.950170 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.954496 4628 scope.go:117] "RemoveContainer" containerID="3d5c2b2171b0513e18b2e2572d848bed9a4ce3b19c9d21d1a01f5c8cb81d348b" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.959044 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:34:09 crc kubenswrapper[4628]: E1211 05:34:09.959539 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerName="sg-core" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.959564 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerName="sg-core" Dec 11 05:34:09 crc kubenswrapper[4628]: E1211 05:34:09.959580 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerName="proxy-httpd" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.959588 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerName="proxy-httpd" Dec 11 05:34:09 crc kubenswrapper[4628]: E1211 05:34:09.959601 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerName="ceilometer-notification-agent" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.959612 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerName="ceilometer-notification-agent" Dec 11 05:34:09 crc kubenswrapper[4628]: E1211 05:34:09.959631 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerName="ceilometer-central-agent" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.959639 4628 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerName="ceilometer-central-agent" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.959838 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerName="ceilometer-notification-agent" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.959888 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerName="sg-core" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.959911 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerName="proxy-httpd" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.959923 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" containerName="ceilometer-central-agent" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.961796 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.966817 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.967090 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.972161 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:34:09 crc kubenswrapper[4628]: I1211 05:34:09.975303 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.035849 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-config-data\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.035934 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlcf9\" (UniqueName: \"kubernetes.io/projected/350c7aef-5a63-4478-b857-a2ad272d4d75-kube-api-access-vlcf9\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.035976 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-scripts\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.035991 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.036047 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.036107 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350c7aef-5a63-4478-b857-a2ad272d4d75-log-httpd\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.036133 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.036154 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350c7aef-5a63-4478-b857-a2ad272d4d75-run-httpd\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.140371 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350c7aef-5a63-4478-b857-a2ad272d4d75-run-httpd\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.140448 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-config-data\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.140481 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlcf9\" (UniqueName: \"kubernetes.io/projected/350c7aef-5a63-4478-b857-a2ad272d4d75-kube-api-access-vlcf9\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.140506 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-scripts\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.140524 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.140581 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.140633 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/350c7aef-5a63-4478-b857-a2ad272d4d75-log-httpd\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.140658 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.140892 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350c7aef-5a63-4478-b857-a2ad272d4d75-run-httpd\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.141447 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350c7aef-5a63-4478-b857-a2ad272d4d75-log-httpd\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.145923 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.146090 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-config-data\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.148056 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-scripts\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.149043 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.149342 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.173554 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlcf9\" (UniqueName: \"kubernetes.io/projected/350c7aef-5a63-4478-b857-a2ad272d4d75-kube-api-access-vlcf9\") pod \"ceilometer-0\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.289082 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.620456 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:34:10 crc kubenswrapper[4628]: W1211 05:34:10.641967 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod350c7aef_5a63_4478_b857_a2ad272d4d75.slice/crio-4cbb2495533b619e63e5c5267286d8b272b3253edf7a81c419c0675ec7e14a64 WatchSource:0}: Error finding container 4cbb2495533b619e63e5c5267286d8b272b3253edf7a81c419c0675ec7e14a64: Status 404 returned error can't find the container with id 4cbb2495533b619e63e5c5267286d8b272b3253edf7a81c419c0675ec7e14a64 Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.849056 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350c7aef-5a63-4478-b857-a2ad272d4d75","Type":"ContainerStarted","Data":"4cbb2495533b619e63e5c5267286d8b272b3253edf7a81c419c0675ec7e14a64"} Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.850290 4628 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 05:34:10 crc kubenswrapper[4628]: I1211 05:34:10.850307 4628 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 05:34:11 crc kubenswrapper[4628]: I1211 05:34:11.414584 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 11 05:34:11 crc kubenswrapper[4628]: I1211 05:34:11.421101 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 11 05:34:11 crc kubenswrapper[4628]: I1211 05:34:11.864009 4628 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 05:34:11 crc kubenswrapper[4628]: I1211 05:34:11.864374 4628 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 11 05:34:11 crc kubenswrapper[4628]: I1211 05:34:11.904401 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1" path="/var/lib/kubelet/pods/ff0ed79d-48fa-4a6a-8a02-91d5d4e35fe1/volumes" Dec 11 05:34:12 crc kubenswrapper[4628]: I1211 05:34:12.062796 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 11 05:34:12 crc kubenswrapper[4628]: I1211 05:34:12.200392 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 11 05:34:12 crc kubenswrapper[4628]: I1211 05:34:12.875641 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350c7aef-5a63-4478-b857-a2ad272d4d75","Type":"ContainerStarted","Data":"898858b28c93e7d104b50ad75512cf245459626a2ed9d4aa3cd6a7b9ff531fcb"} Dec 11 05:34:13 crc kubenswrapper[4628]: I1211 05:34:13.884921 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350c7aef-5a63-4478-b857-a2ad272d4d75","Type":"ContainerStarted","Data":"a869f35e461c880c9010550466393fc2940504ca684c65001162c9583928b4b7"} Dec 11 05:34:13 crc kubenswrapper[4628]: I1211 05:34:13.885210 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350c7aef-5a63-4478-b857-a2ad272d4d75","Type":"ContainerStarted","Data":"6671190ee3a13597797c780eef9e845e170e76223351084c55af2c87cc5c77fd"} Dec 11 05:34:15 crc kubenswrapper[4628]: I1211 05:34:15.912834 
4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350c7aef-5a63-4478-b857-a2ad272d4d75","Type":"ContainerStarted","Data":"52b73fcf7c230bd4a08743ed7128297c2aa9e77b6cce65dc8a684d29ce2d1dad"} Dec 11 05:34:15 crc kubenswrapper[4628]: I1211 05:34:15.913589 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 11 05:34:15 crc kubenswrapper[4628]: I1211 05:34:15.948348 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.661562631 podStartE2EDuration="6.948322545s" podCreationTimestamp="2025-12-11 05:34:09 +0000 UTC" firstStartedPulling="2025-12-11 05:34:10.64597476 +0000 UTC m=+1153.063321448" lastFinishedPulling="2025-12-11 05:34:14.932734664 +0000 UTC m=+1157.350081362" observedRunningTime="2025-12-11 05:34:15.943941555 +0000 UTC m=+1158.361288263" watchObservedRunningTime="2025-12-11 05:34:15.948322545 +0000 UTC m=+1158.365669273" Dec 11 05:34:22 crc kubenswrapper[4628]: I1211 05:34:22.017176 4628 generic.go:334] "Generic (PLEG): container finished" podID="58137696-e08d-4d26-ba22-d0fdb614485b" containerID="329660e4821c297605c9d21981de80333adb9b65fd18540e9b3eb13602643569" exitCode=0 Dec 11 05:34:22 crc kubenswrapper[4628]: I1211 05:34:22.017280 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k6kds" event={"ID":"58137696-e08d-4d26-ba22-d0fdb614485b","Type":"ContainerDied","Data":"329660e4821c297605c9d21981de80333adb9b65fd18540e9b3eb13602643569"} Dec 11 05:34:23 crc kubenswrapper[4628]: I1211 05:34:23.405428 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k6kds" Dec 11 05:34:23 crc kubenswrapper[4628]: I1211 05:34:23.511117 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9g62\" (UniqueName: \"kubernetes.io/projected/58137696-e08d-4d26-ba22-d0fdb614485b-kube-api-access-d9g62\") pod \"58137696-e08d-4d26-ba22-d0fdb614485b\" (UID: \"58137696-e08d-4d26-ba22-d0fdb614485b\") " Dec 11 05:34:23 crc kubenswrapper[4628]: I1211 05:34:23.511209 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58137696-e08d-4d26-ba22-d0fdb614485b-combined-ca-bundle\") pod \"58137696-e08d-4d26-ba22-d0fdb614485b\" (UID: \"58137696-e08d-4d26-ba22-d0fdb614485b\") " Dec 11 05:34:23 crc kubenswrapper[4628]: I1211 05:34:23.511340 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58137696-e08d-4d26-ba22-d0fdb614485b-scripts\") pod \"58137696-e08d-4d26-ba22-d0fdb614485b\" (UID: \"58137696-e08d-4d26-ba22-d0fdb614485b\") " Dec 11 05:34:23 crc kubenswrapper[4628]: I1211 05:34:23.511419 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58137696-e08d-4d26-ba22-d0fdb614485b-config-data\") pod \"58137696-e08d-4d26-ba22-d0fdb614485b\" (UID: \"58137696-e08d-4d26-ba22-d0fdb614485b\") " Dec 11 05:34:23 crc kubenswrapper[4628]: I1211 05:34:23.517986 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58137696-e08d-4d26-ba22-d0fdb614485b-kube-api-access-d9g62" (OuterVolumeSpecName: "kube-api-access-d9g62") pod "58137696-e08d-4d26-ba22-d0fdb614485b" (UID: "58137696-e08d-4d26-ba22-d0fdb614485b"). 
InnerVolumeSpecName "kube-api-access-d9g62". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:34:23 crc kubenswrapper[4628]: I1211 05:34:23.522338 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58137696-e08d-4d26-ba22-d0fdb614485b-scripts" (OuterVolumeSpecName: "scripts") pod "58137696-e08d-4d26-ba22-d0fdb614485b" (UID: "58137696-e08d-4d26-ba22-d0fdb614485b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:23 crc kubenswrapper[4628]: I1211 05:34:23.538260 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58137696-e08d-4d26-ba22-d0fdb614485b-config-data" (OuterVolumeSpecName: "config-data") pod "58137696-e08d-4d26-ba22-d0fdb614485b" (UID: "58137696-e08d-4d26-ba22-d0fdb614485b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:23 crc kubenswrapper[4628]: I1211 05:34:23.543060 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58137696-e08d-4d26-ba22-d0fdb614485b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58137696-e08d-4d26-ba22-d0fdb614485b" (UID: "58137696-e08d-4d26-ba22-d0fdb614485b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:23 crc kubenswrapper[4628]: I1211 05:34:23.613439 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58137696-e08d-4d26-ba22-d0fdb614485b-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:23 crc kubenswrapper[4628]: I1211 05:34:23.613515 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58137696-e08d-4d26-ba22-d0fdb614485b-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:23 crc kubenswrapper[4628]: I1211 05:34:23.613537 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9g62\" (UniqueName: \"kubernetes.io/projected/58137696-e08d-4d26-ba22-d0fdb614485b-kube-api-access-d9g62\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:23 crc kubenswrapper[4628]: I1211 05:34:23.613557 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58137696-e08d-4d26-ba22-d0fdb614485b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.042088 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k6kds" event={"ID":"58137696-e08d-4d26-ba22-d0fdb614485b","Type":"ContainerDied","Data":"2cacb951f9adbd82b3e7d86b674a4d3649d712a0f6eec2ca73537bdbf8cd32ea"} Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.042144 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cacb951f9adbd82b3e7d86b674a4d3649d712a0f6eec2ca73537bdbf8cd32ea" Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.042789 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k6kds" Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.168482 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 11 05:34:24 crc kubenswrapper[4628]: E1211 05:34:24.169102 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58137696-e08d-4d26-ba22-d0fdb614485b" containerName="nova-cell0-conductor-db-sync" Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.169119 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="58137696-e08d-4d26-ba22-d0fdb614485b" containerName="nova-cell0-conductor-db-sync" Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.169304 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="58137696-e08d-4d26-ba22-d0fdb614485b" containerName="nova-cell0-conductor-db-sync" Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.169898 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.172929 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-n2vkj" Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.173185 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.188037 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.225880 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccc5f4b2-0364-42a0-abba-16c0e471f5c6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ccc5f4b2-0364-42a0-abba-16c0e471f5c6\") " pod="openstack/nova-cell0-conductor-0" Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.226039 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k7nz\" (UniqueName: \"kubernetes.io/projected/ccc5f4b2-0364-42a0-abba-16c0e471f5c6-kube-api-access-2k7nz\") pod \"nova-cell0-conductor-0\" (UID: \"ccc5f4b2-0364-42a0-abba-16c0e471f5c6\") " pod="openstack/nova-cell0-conductor-0" Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.226124 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc5f4b2-0364-42a0-abba-16c0e471f5c6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ccc5f4b2-0364-42a0-abba-16c0e471f5c6\") " pod="openstack/nova-cell0-conductor-0" Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.327457 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k7nz\" (UniqueName: \"kubernetes.io/projected/ccc5f4b2-0364-42a0-abba-16c0e471f5c6-kube-api-access-2k7nz\") pod \"nova-cell0-conductor-0\" (UID: \"ccc5f4b2-0364-42a0-abba-16c0e471f5c6\") " pod="openstack/nova-cell0-conductor-0" Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.327537 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc5f4b2-0364-42a0-abba-16c0e471f5c6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ccc5f4b2-0364-42a0-abba-16c0e471f5c6\") " pod="openstack/nova-cell0-conductor-0" Dec 11 05:34:24 crc 
kubenswrapper[4628]: I1211 05:34:24.327617 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccc5f4b2-0364-42a0-abba-16c0e471f5c6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ccc5f4b2-0364-42a0-abba-16c0e471f5c6\") " pod="openstack/nova-cell0-conductor-0" Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.331547 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccc5f4b2-0364-42a0-abba-16c0e471f5c6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ccc5f4b2-0364-42a0-abba-16c0e471f5c6\") " pod="openstack/nova-cell0-conductor-0" Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.331602 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccc5f4b2-0364-42a0-abba-16c0e471f5c6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ccc5f4b2-0364-42a0-abba-16c0e471f5c6\") " pod="openstack/nova-cell0-conductor-0" Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.344041 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k7nz\" (UniqueName: \"kubernetes.io/projected/ccc5f4b2-0364-42a0-abba-16c0e471f5c6-kube-api-access-2k7nz\") pod \"nova-cell0-conductor-0\" (UID: \"ccc5f4b2-0364-42a0-abba-16c0e471f5c6\") " pod="openstack/nova-cell0-conductor-0" Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.503352 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 11 05:34:24 crc kubenswrapper[4628]: I1211 05:34:24.969083 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 11 05:34:25 crc kubenswrapper[4628]: I1211 05:34:25.055955 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ccc5f4b2-0364-42a0-abba-16c0e471f5c6","Type":"ContainerStarted","Data":"56e9292922927af248bd08cf0b621f1e2b9f3f1796177bccfe8cfeda2efaacba"} Dec 11 05:34:26 crc kubenswrapper[4628]: I1211 05:34:26.075215 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ccc5f4b2-0364-42a0-abba-16c0e471f5c6","Type":"ContainerStarted","Data":"4fa21d13a5924745cc5286c2b4e1e36129601fe0f0acf9afbd4b6de17422f24d"} Dec 11 05:34:26 crc kubenswrapper[4628]: I1211 05:34:26.076389 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 11 05:34:31 crc kubenswrapper[4628]: I1211 05:34:31.427686 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:34:31 crc kubenswrapper[4628]: I1211 05:34:31.428635 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:34:31 crc kubenswrapper[4628]: I1211 05:34:31.428716 4628 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 
05:34:31 crc kubenswrapper[4628]: I1211 05:34:31.429889 4628 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ecc8b439306d6103b7fabe922fa79c181a14fe03fbd4c6f00b4023e3934e67c"} pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 05:34:31 crc kubenswrapper[4628]: I1211 05:34:31.429982 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" containerID="cri-o://6ecc8b439306d6103b7fabe922fa79c181a14fe03fbd4c6f00b4023e3934e67c" gracePeriod=600 Dec 11 05:34:32 crc kubenswrapper[4628]: I1211 05:34:32.149377 4628 generic.go:334] "Generic (PLEG): container finished" podID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerID="6ecc8b439306d6103b7fabe922fa79c181a14fe03fbd4c6f00b4023e3934e67c" exitCode=0 Dec 11 05:34:32 crc kubenswrapper[4628]: I1211 05:34:32.149462 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerDied","Data":"6ecc8b439306d6103b7fabe922fa79c181a14fe03fbd4c6f00b4023e3934e67c"} Dec 11 05:34:32 crc kubenswrapper[4628]: I1211 05:34:32.149677 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"a5a1b35de252bf7b6d284e501103fc4953df20ee7d9a62a56c9a69ef2d0ee180"} Dec 11 05:34:32 crc kubenswrapper[4628]: I1211 05:34:32.149702 4628 scope.go:117] "RemoveContainer" containerID="4024bd10762b90d0b487ed903bd8b69e2ebeac5fe50ac7d4b3037fdf7a40c2b1" Dec 11 05:34:32 crc kubenswrapper[4628]: I1211 05:34:32.177322 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=8.177297389 podStartE2EDuration="8.177297389s" podCreationTimestamp="2025-12-11 05:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:34:26.104830308 +0000 UTC m=+1168.522177016" watchObservedRunningTime="2025-12-11 05:34:32.177297389 +0000 UTC m=+1174.594644097" Dec 11 05:34:34 crc kubenswrapper[4628]: I1211 05:34:34.545096 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.029935 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-7bm5v"] Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.032402 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7bm5v" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.035912 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.037641 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.039609 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7bm5v"] Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.140940 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d090d098-9d30-4ee0-89e6-a408f1340325-scripts\") pod \"nova-cell0-cell-mapping-7bm5v\" (UID: \"d090d098-9d30-4ee0-89e6-a408f1340325\") " pod="openstack/nova-cell0-cell-mapping-7bm5v" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.141017 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skgzv\" (UniqueName: \"kubernetes.io/projected/d090d098-9d30-4ee0-89e6-a408f1340325-kube-api-access-skgzv\") pod \"nova-cell0-cell-mapping-7bm5v\" (UID: \"d090d098-9d30-4ee0-89e6-a408f1340325\") " pod="openstack/nova-cell0-cell-mapping-7bm5v" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.141086 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d090d098-9d30-4ee0-89e6-a408f1340325-config-data\") pod \"nova-cell0-cell-mapping-7bm5v\" (UID: \"d090d098-9d30-4ee0-89e6-a408f1340325\") " pod="openstack/nova-cell0-cell-mapping-7bm5v" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.141128 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d090d098-9d30-4ee0-89e6-a408f1340325-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7bm5v\" (UID: \"d090d098-9d30-4ee0-89e6-a408f1340325\") " pod="openstack/nova-cell0-cell-mapping-7bm5v" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.243059 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d090d098-9d30-4ee0-89e6-a408f1340325-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7bm5v\" (UID: \"d090d098-9d30-4ee0-89e6-a408f1340325\") " pod="openstack/nova-cell0-cell-mapping-7bm5v" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.243145 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d090d098-9d30-4ee0-89e6-a408f1340325-scripts\") pod \"nova-cell0-cell-mapping-7bm5v\" (UID: \"d090d098-9d30-4ee0-89e6-a408f1340325\") " pod="openstack/nova-cell0-cell-mapping-7bm5v" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.243215 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skgzv\" (UniqueName: \"kubernetes.io/projected/d090d098-9d30-4ee0-89e6-a408f1340325-kube-api-access-skgzv\") pod \"nova-cell0-cell-mapping-7bm5v\" (UID: \"d090d098-9d30-4ee0-89e6-a408f1340325\") " pod="openstack/nova-cell0-cell-mapping-7bm5v" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.243265 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d090d098-9d30-4ee0-89e6-a408f1340325-config-data\") pod \"nova-cell0-cell-mapping-7bm5v\" (UID: \"d090d098-9d30-4ee0-89e6-a408f1340325\") " pod="openstack/nova-cell0-cell-mapping-7bm5v" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.256563 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d090d098-9d30-4ee0-89e6-a408f1340325-scripts\") pod \"nova-cell0-cell-mapping-7bm5v\" (UID: \"d090d098-9d30-4ee0-89e6-a408f1340325\") " pod="openstack/nova-cell0-cell-mapping-7bm5v" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.256624 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d090d098-9d30-4ee0-89e6-a408f1340325-config-data\") pod \"nova-cell0-cell-mapping-7bm5v\" (UID: \"d090d098-9d30-4ee0-89e6-a408f1340325\") " pod="openstack/nova-cell0-cell-mapping-7bm5v" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.262071 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d090d098-9d30-4ee0-89e6-a408f1340325-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7bm5v\" (UID: \"d090d098-9d30-4ee0-89e6-a408f1340325\") " pod="openstack/nova-cell0-cell-mapping-7bm5v" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.274144 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.284214 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skgzv\" (UniqueName: \"kubernetes.io/projected/d090d098-9d30-4ee0-89e6-a408f1340325-kube-api-access-skgzv\") pod \"nova-cell0-cell-mapping-7bm5v\" (UID: \"d090d098-9d30-4ee0-89e6-a408f1340325\") " pod="openstack/nova-cell0-cell-mapping-7bm5v" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.287066 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.296162 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.304082 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.346619 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\") " pod="openstack/nova-api-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.346669 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cqht\" (UniqueName: \"kubernetes.io/projected/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-kube-api-access-7cqht\") pod \"nova-api-0\" (UID: \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\") " pod="openstack/nova-api-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.346719 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-config-data\") pod \"nova-api-0\" (UID: \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\") " pod="openstack/nova-api-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.346778 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-logs\") pod \"nova-api-0\" (UID: \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\") " pod="openstack/nova-api-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.354559 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7bm5v" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.433230 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.435125 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.451438 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-logs\") pod \"nova-api-0\" (UID: \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\") " pod="openstack/nova-api-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.451502 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a509ccd-0386-4db9-9648-ad9a5618b726-logs\") pod \"nova-metadata-0\" (UID: \"2a509ccd-0386-4db9-9648-ad9a5618b726\") " pod="openstack/nova-metadata-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.451546 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a509ccd-0386-4db9-9648-ad9a5618b726-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2a509ccd-0386-4db9-9648-ad9a5618b726\") " pod="openstack/nova-metadata-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.451575 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a509ccd-0386-4db9-9648-ad9a5618b726-config-data\") pod \"nova-metadata-0\" (UID: \"2a509ccd-0386-4db9-9648-ad9a5618b726\") " pod="openstack/nova-metadata-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.451620 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\") " pod="openstack/nova-api-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.451648 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cqht\" (UniqueName: \"kubernetes.io/projected/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-kube-api-access-7cqht\") pod \"nova-api-0\" (UID: \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\") " pod="openstack/nova-api-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.451704 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-config-data\") pod \"nova-api-0\" (UID: \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\") " pod="openstack/nova-api-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.451754 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v72g7\" (UniqueName: \"kubernetes.io/projected/2a509ccd-0386-4db9-9648-ad9a5618b726-kube-api-access-v72g7\") pod \"nova-metadata-0\" (UID: \"2a509ccd-0386-4db9-9648-ad9a5618b726\") " pod="openstack/nova-metadata-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.452277 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-logs\") pod \"nova-api-0\" (UID: \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\") " pod="openstack/nova-api-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.454372 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.455929 4628 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\") " pod="openstack/nova-api-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.471435 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-config-data\") pod \"nova-api-0\" (UID: \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\") " pod="openstack/nova-api-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.489586 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.500523 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cqht\" (UniqueName: \"kubernetes.io/projected/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-kube-api-access-7cqht\") pod \"nova-api-0\" (UID: \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\") " pod="openstack/nova-api-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.562603 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a509ccd-0386-4db9-9648-ad9a5618b726-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2a509ccd-0386-4db9-9648-ad9a5618b726\") " pod="openstack/nova-metadata-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.562644 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a509ccd-0386-4db9-9648-ad9a5618b726-config-data\") pod \"nova-metadata-0\" (UID: \"2a509ccd-0386-4db9-9648-ad9a5618b726\") " pod="openstack/nova-metadata-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.562747 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v72g7\" (UniqueName: \"kubernetes.io/projected/2a509ccd-0386-4db9-9648-ad9a5618b726-kube-api-access-v72g7\") pod \"nova-metadata-0\" (UID: \"2a509ccd-0386-4db9-9648-ad9a5618b726\") " pod="openstack/nova-metadata-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.562777 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a509ccd-0386-4db9-9648-ad9a5618b726-logs\") pod \"nova-metadata-0\" (UID: \"2a509ccd-0386-4db9-9648-ad9a5618b726\") " pod="openstack/nova-metadata-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.567206 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a509ccd-0386-4db9-9648-ad9a5618b726-logs\") pod \"nova-metadata-0\" (UID: \"2a509ccd-0386-4db9-9648-ad9a5618b726\") " pod="openstack/nova-metadata-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.573112 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a509ccd-0386-4db9-9648-ad9a5618b726-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2a509ccd-0386-4db9-9648-ad9a5618b726\") " pod="openstack/nova-metadata-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.573155 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a509ccd-0386-4db9-9648-ad9a5618b726-config-data\") pod \"nova-metadata-0\" (UID: 
\"2a509ccd-0386-4db9-9648-ad9a5618b726\") " pod="openstack/nova-metadata-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.589985 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.591220 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.596352 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.601559 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v72g7\" (UniqueName: \"kubernetes.io/projected/2a509ccd-0386-4db9-9648-ad9a5618b726-kube-api-access-v72g7\") pod \"nova-metadata-0\" (UID: \"2a509ccd-0386-4db9-9648-ad9a5618b726\") " pod="openstack/nova-metadata-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.611187 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.685784 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bddbc62d-5dcc-4ad3-9e74-adf443315395-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bddbc62d-5dcc-4ad3-9e74-adf443315395\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.692638 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.714318 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bddbc62d-5dcc-4ad3-9e74-adf443315395-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bddbc62d-5dcc-4ad3-9e74-adf443315395\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.714380 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zlw5\" (UniqueName: \"kubernetes.io/projected/bddbc62d-5dcc-4ad3-9e74-adf443315395-kube-api-access-4zlw5\") pod \"nova-cell1-novncproxy-0\" (UID: \"bddbc62d-5dcc-4ad3-9e74-adf443315395\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.755943 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-9mx9c"] Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.768691 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.831055 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-9mx9c"] Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.832877 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bddbc62d-5dcc-4ad3-9e74-adf443315395-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bddbc62d-5dcc-4ad3-9e74-adf443315395\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.832909 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zlw5\" (UniqueName: \"kubernetes.io/projected/bddbc62d-5dcc-4ad3-9e74-adf443315395-kube-api-access-4zlw5\") pod \"nova-cell1-novncproxy-0\" (UID: \"bddbc62d-5dcc-4ad3-9e74-adf443315395\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.832935 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-config\") pod \"dnsmasq-dns-757b4f8459-9mx9c\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.832959 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-9mx9c\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.832979 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8d4b\" (UniqueName: \"kubernetes.io/projected/532398ad-d877-4e17-a86e-d403bc8a6678-kube-api-access-g8d4b\") pod \"dnsmasq-dns-757b4f8459-9mx9c\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.833023 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-dns-svc\") pod \"dnsmasq-dns-757b4f8459-9mx9c\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.833082 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-9mx9c\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.833098 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bddbc62d-5dcc-4ad3-9e74-adf443315395-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bddbc62d-5dcc-4ad3-9e74-adf443315395\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.833133 4628 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-9mx9c\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.845034 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bddbc62d-5dcc-4ad3-9e74-adf443315395-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bddbc62d-5dcc-4ad3-9e74-adf443315395\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.845557 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bddbc62d-5dcc-4ad3-9e74-adf443315395-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bddbc62d-5dcc-4ad3-9e74-adf443315395\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.847019 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.872313 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zlw5\" (UniqueName: \"kubernetes.io/projected/bddbc62d-5dcc-4ad3-9e74-adf443315395-kube-api-access-4zlw5\") pod \"nova-cell1-novncproxy-0\" (UID: \"bddbc62d-5dcc-4ad3-9e74-adf443315395\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.884225 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.886223 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.892494 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.935791 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4653a43-d598-4252-9f50-f2d14521d44c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e4653a43-d598-4252-9f50-f2d14521d44c\") " pod="openstack/nova-scheduler-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.936022 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-config\") pod \"dnsmasq-dns-757b4f8459-9mx9c\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.936094 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnmdr\" (UniqueName: \"kubernetes.io/projected/e4653a43-d598-4252-9f50-f2d14521d44c-kube-api-access-tnmdr\") pod \"nova-scheduler-0\" (UID: \"e4653a43-d598-4252-9f50-f2d14521d44c\") " pod="openstack/nova-scheduler-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.936251 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-9mx9c\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.936309 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8d4b\" (UniqueName: \"kubernetes.io/projected/532398ad-d877-4e17-a86e-d403bc8a6678-kube-api-access-g8d4b\") pod \"dnsmasq-dns-757b4f8459-9mx9c\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.936458 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-dns-svc\") pod \"dnsmasq-dns-757b4f8459-9mx9c\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.936501 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4653a43-d598-4252-9f50-f2d14521d44c-config-data\") pod \"nova-scheduler-0\" (UID: \"e4653a43-d598-4252-9f50-f2d14521d44c\") " pod="openstack/nova-scheduler-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.936670 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-9mx9c\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.936762 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-9mx9c\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.940901 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-9mx9c\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.941456 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-9mx9c\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.942000 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-9mx9c\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.946181 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.948991 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-dns-svc\") pod \"dnsmasq-dns-757b4f8459-9mx9c\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.955748 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-config\") pod \"dnsmasq-dns-757b4f8459-9mx9c\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.955953 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 05:34:35 crc kubenswrapper[4628]: I1211 05:34:35.974635 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8d4b\" (UniqueName: \"kubernetes.io/projected/532398ad-d877-4e17-a86e-d403bc8a6678-kube-api-access-g8d4b\") pod \"dnsmasq-dns-757b4f8459-9mx9c\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.044227 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4653a43-d598-4252-9f50-f2d14521d44c-config-data\") pod \"nova-scheduler-0\" (UID: \"e4653a43-d598-4252-9f50-f2d14521d44c\") " pod="openstack/nova-scheduler-0" Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.044381 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4653a43-d598-4252-9f50-f2d14521d44c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e4653a43-d598-4252-9f50-f2d14521d44c\") " pod="openstack/nova-scheduler-0" Dec 
11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.044421 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnmdr\" (UniqueName: \"kubernetes.io/projected/e4653a43-d598-4252-9f50-f2d14521d44c-kube-api-access-tnmdr\") pod \"nova-scheduler-0\" (UID: \"e4653a43-d598-4252-9f50-f2d14521d44c\") " pod="openstack/nova-scheduler-0" Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.048294 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4653a43-d598-4252-9f50-f2d14521d44c-config-data\") pod \"nova-scheduler-0\" (UID: \"e4653a43-d598-4252-9f50-f2d14521d44c\") " pod="openstack/nova-scheduler-0" Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.053516 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4653a43-d598-4252-9f50-f2d14521d44c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e4653a43-d598-4252-9f50-f2d14521d44c\") " pod="openstack/nova-scheduler-0" Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.067631 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnmdr\" (UniqueName: \"kubernetes.io/projected/e4653a43-d598-4252-9f50-f2d14521d44c-kube-api-access-tnmdr\") pod \"nova-scheduler-0\" (UID: \"e4653a43-d598-4252-9f50-f2d14521d44c\") " pod="openstack/nova-scheduler-0" Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.135219 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.145914 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7bm5v"] Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.239153 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.373620 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.396465 4628 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.490604 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:34:36 crc kubenswrapper[4628]: W1211 05:34:36.531798 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a509ccd_0386_4db9_9648_ad9a5618b726.slice/crio-e8fa5c04d9bb0db64bbf31f0e1238c47a545469b79bc784f70990f37a15c558e WatchSource:0}: Error finding container e8fa5c04d9bb0db64bbf31f0e1238c47a545469b79bc784f70990f37a15c558e: Status 404 returned error can't find the container with id e8fa5c04d9bb0db64bbf31f0e1238c47a545469b79bc784f70990f37a15c558e Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.632179 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.846060 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-9mx9c"] Dec 11 05:34:36 crc kubenswrapper[4628]: W1211 05:34:36.851958 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod532398ad_d877_4e17_a86e_d403bc8a6678.slice/crio-0af8af14f1977b99f7f75e2bc22b833a3d2c1a99dc96510621756d33bdf77d35 WatchSource:0}: Error finding container 0af8af14f1977b99f7f75e2bc22b833a3d2c1a99dc96510621756d33bdf77d35: Status 404 returned error can't find the container with id 0af8af14f1977b99f7f75e2bc22b833a3d2c1a99dc96510621756d33bdf77d35 Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.890979 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nsgqf"] Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.892354 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nsgqf" Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.895378 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.895575 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.901616 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nsgqf"] Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.970122 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4df8deca-75c5-40b2-a666-4b1c6050c273-config-data\") pod \"nova-cell1-conductor-db-sync-nsgqf\" (UID: \"4df8deca-75c5-40b2-a666-4b1c6050c273\") " pod="openstack/nova-cell1-conductor-db-sync-nsgqf" Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.970211 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4df8deca-75c5-40b2-a666-4b1c6050c273-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nsgqf\" (UID: \"4df8deca-75c5-40b2-a666-4b1c6050c273\") " pod="openstack/nova-cell1-conductor-db-sync-nsgqf" Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.970228 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4df8deca-75c5-40b2-a666-4b1c6050c273-scripts\") pod \"nova-cell1-conductor-db-sync-nsgqf\" (UID: \"4df8deca-75c5-40b2-a666-4b1c6050c273\") " pod="openstack/nova-cell1-conductor-db-sync-nsgqf" Dec 11 05:34:36 crc kubenswrapper[4628]: I1211 05:34:36.970284 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c94s\" (UniqueName: \"kubernetes.io/projected/4df8deca-75c5-40b2-a666-4b1c6050c273-kube-api-access-6c94s\") pod \"nova-cell1-conductor-db-sync-nsgqf\" (UID: \"4df8deca-75c5-40b2-a666-4b1c6050c273\") " pod="openstack/nova-cell1-conductor-db-sync-nsgqf" Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.013072 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.072364 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4df8deca-75c5-40b2-a666-4b1c6050c273-config-data\") pod \"nova-cell1-conductor-db-sync-nsgqf\" (UID: \"4df8deca-75c5-40b2-a666-4b1c6050c273\") " pod="openstack/nova-cell1-conductor-db-sync-nsgqf" Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.072439 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4df8deca-75c5-40b2-a666-4b1c6050c273-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nsgqf\" (UID: \"4df8deca-75c5-40b2-a666-4b1c6050c273\") " pod="openstack/nova-cell1-conductor-db-sync-nsgqf" Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.072460 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4df8deca-75c5-40b2-a666-4b1c6050c273-scripts\") pod \"nova-cell1-conductor-db-sync-nsgqf\" (UID: \"4df8deca-75c5-40b2-a666-4b1c6050c273\") " 
pod="openstack/nova-cell1-conductor-db-sync-nsgqf" Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.072502 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c94s\" (UniqueName: \"kubernetes.io/projected/4df8deca-75c5-40b2-a666-4b1c6050c273-kube-api-access-6c94s\") pod \"nova-cell1-conductor-db-sync-nsgqf\" (UID: \"4df8deca-75c5-40b2-a666-4b1c6050c273\") " pod="openstack/nova-cell1-conductor-db-sync-nsgqf" Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.080210 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4df8deca-75c5-40b2-a666-4b1c6050c273-scripts\") pod \"nova-cell1-conductor-db-sync-nsgqf\" (UID: \"4df8deca-75c5-40b2-a666-4b1c6050c273\") " pod="openstack/nova-cell1-conductor-db-sync-nsgqf" Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.082546 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4df8deca-75c5-40b2-a666-4b1c6050c273-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nsgqf\" (UID: \"4df8deca-75c5-40b2-a666-4b1c6050c273\") " pod="openstack/nova-cell1-conductor-db-sync-nsgqf" Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.084276 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4df8deca-75c5-40b2-a666-4b1c6050c273-config-data\") pod \"nova-cell1-conductor-db-sync-nsgqf\" (UID: \"4df8deca-75c5-40b2-a666-4b1c6050c273\") " pod="openstack/nova-cell1-conductor-db-sync-nsgqf" Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.094687 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c94s\" (UniqueName: \"kubernetes.io/projected/4df8deca-75c5-40b2-a666-4b1c6050c273-kube-api-access-6c94s\") pod \"nova-cell1-conductor-db-sync-nsgqf\" (UID: \"4df8deca-75c5-40b2-a666-4b1c6050c273\") " pod="openstack/nova-cell1-conductor-db-sync-nsgqf" Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.217424 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nsgqf" Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.230337 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e4653a43-d598-4252-9f50-f2d14521d44c","Type":"ContainerStarted","Data":"57fe8e9983c014234a4eb98fd44af7a6a02be5a7a02e1858a1365f644bc8f8b5"} Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.243651 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7bm5v" event={"ID":"d090d098-9d30-4ee0-89e6-a408f1340325","Type":"ContainerStarted","Data":"32140fba5d8f55d75c4386c2a272a4df60e1a31b54f45ba902474c0205177e5e"} Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.243697 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7bm5v" event={"ID":"d090d098-9d30-4ee0-89e6-a408f1340325","Type":"ContainerStarted","Data":"3b2f754cf197e6fc1ff32abe15bb48214099520cb9859194f72a1cbf876dc2e5"} Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.248676 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"513edbab-8ccd-48ab-86e4-103e0dd3fc9e","Type":"ContainerStarted","Data":"486e87f17bc7859a07d6b42379941680493f38d682fa222c409152ce59985f6b"} Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.250532 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bddbc62d-5dcc-4ad3-9e74-adf443315395","Type":"ContainerStarted","Data":"6b189f099e5cb15aa2011252b1d95c786f98ec0506d2c12e98aa2561a618e2d9"} Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.263711 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-7bm5v" podStartSLOduration=2.263693178 podStartE2EDuration="2.263693178s" podCreationTimestamp="2025-12-11 05:34:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:34:37.263586585 +0000 UTC m=+1179.680933283" watchObservedRunningTime="2025-12-11 05:34:37.263693178 +0000 UTC m=+1179.681039876" Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.267542 4628 generic.go:334] "Generic (PLEG): container finished" podID="532398ad-d877-4e17-a86e-d403bc8a6678" containerID="c0d5305660274726e29d26e7cf1d7fb50542de05edc3e97a9c67bed0f7743aa8" exitCode=0 Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.267609 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" event={"ID":"532398ad-d877-4e17-a86e-d403bc8a6678","Type":"ContainerDied","Data":"c0d5305660274726e29d26e7cf1d7fb50542de05edc3e97a9c67bed0f7743aa8"} Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.267645 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" event={"ID":"532398ad-d877-4e17-a86e-d403bc8a6678","Type":"ContainerStarted","Data":"0af8af14f1977b99f7f75e2bc22b833a3d2c1a99dc96510621756d33bdf77d35"} Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.269267 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a509ccd-0386-4db9-9648-ad9a5618b726","Type":"ContainerStarted","Data":"e8fa5c04d9bb0db64bbf31f0e1238c47a545469b79bc784f70990f37a15c558e"} Dec 11 05:34:37 crc kubenswrapper[4628]: I1211 05:34:37.846353 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nsgqf"] Dec 11 
05:34:38 crc kubenswrapper[4628]: I1211 05:34:38.299603 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" event={"ID":"532398ad-d877-4e17-a86e-d403bc8a6678","Type":"ContainerStarted","Data":"46d39b194853d7bbeebac6326bd199cc66aa51ad42e52b74f38f0ca053f49541"} Dec 11 05:34:38 crc kubenswrapper[4628]: I1211 05:34:38.301495 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:38 crc kubenswrapper[4628]: I1211 05:34:38.305891 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nsgqf" event={"ID":"4df8deca-75c5-40b2-a666-4b1c6050c273","Type":"ContainerStarted","Data":"a440815945b3800abc9b892b20207b075ba44b44f8e0f0c4be09ef266ff42391"} Dec 11 05:34:38 crc kubenswrapper[4628]: I1211 05:34:38.332115 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" podStartSLOduration=3.33198877 podStartE2EDuration="3.33198877s" podCreationTimestamp="2025-12-11 05:34:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:34:38.317123171 +0000 UTC m=+1180.734469869" watchObservedRunningTime="2025-12-11 05:34:38.33198877 +0000 UTC m=+1180.749335468" Dec 11 05:34:38 crc kubenswrapper[4628]: I1211 05:34:38.346984 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-nsgqf" podStartSLOduration=2.346962862 podStartE2EDuration="2.346962862s" podCreationTimestamp="2025-12-11 05:34:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:34:38.333909412 +0000 UTC m=+1180.751256120" watchObservedRunningTime="2025-12-11 05:34:38.346962862 +0000 UTC m=+1180.764309550" Dec 11 05:34:39 crc kubenswrapper[4628]: I1211 05:34:39.079022 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:34:39 crc kubenswrapper[4628]: I1211 05:34:39.129352 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 11 05:34:39 crc kubenswrapper[4628]: I1211 05:34:39.355562 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nsgqf" event={"ID":"4df8deca-75c5-40b2-a666-4b1c6050c273","Type":"ContainerStarted","Data":"e0c7e6e0b9f64661b8ee1a96199acb85c920ee37827d52abac1e94d14daf375a"} Dec 11 05:34:40 crc kubenswrapper[4628]: I1211 05:34:40.342495 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 11 05:34:42 crc kubenswrapper[4628]: I1211 05:34:42.393822 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a509ccd-0386-4db9-9648-ad9a5618b726","Type":"ContainerStarted","Data":"9609d3c5c4c2322b6e9b4331b88e4da56b9759be028f4df7bb20341c06c899d9"} Dec 11 05:34:42 crc kubenswrapper[4628]: I1211 05:34:42.394280 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a509ccd-0386-4db9-9648-ad9a5618b726","Type":"ContainerStarted","Data":"4445ffdec4649620dcfecd986c7878de95a14354f7fb0c68e5491d1728a1788e"} Dec 11 05:34:42 crc kubenswrapper[4628]: I1211 05:34:42.394001 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="2a509ccd-0386-4db9-9648-ad9a5618b726" containerName="nova-metadata-metadata" containerID="cri-o://9609d3c5c4c2322b6e9b4331b88e4da56b9759be028f4df7bb20341c06c899d9" gracePeriod=30 Dec 11 05:34:42 crc kubenswrapper[4628]: I1211 05:34:42.393947 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2a509ccd-0386-4db9-9648-ad9a5618b726" containerName="nova-metadata-log" containerID="cri-o://4445ffdec4649620dcfecd986c7878de95a14354f7fb0c68e5491d1728a1788e" gracePeriod=30 Dec 11 05:34:42 crc kubenswrapper[4628]: I1211 05:34:42.396064 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e4653a43-d598-4252-9f50-f2d14521d44c","Type":"ContainerStarted","Data":"f3997606b95dd7dacd1288015caf1442a40850ca9e905c9f80aa8d6b8166a48d"} Dec 11 05:34:42 crc kubenswrapper[4628]: I1211 05:34:42.411351 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"513edbab-8ccd-48ab-86e4-103e0dd3fc9e","Type":"ContainerStarted","Data":"b3727ba33efdfdca1f868d1099eb0f486e3a1854d9c451d990454747c73cddca"} Dec 11 05:34:42 crc kubenswrapper[4628]: I1211 05:34:42.411414 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"513edbab-8ccd-48ab-86e4-103e0dd3fc9e","Type":"ContainerStarted","Data":"100b19f66f91eb948a669a28deaf9f10ac734b8488c81c1d62870220dbcbd2fc"} Dec 11 05:34:42 crc kubenswrapper[4628]: I1211 05:34:42.416563 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.573913623 podStartE2EDuration="7.416543991s" podCreationTimestamp="2025-12-11 05:34:35 +0000 UTC" firstStartedPulling="2025-12-11 05:34:36.547010162 +0000 UTC m=+1178.964356860" lastFinishedPulling="2025-12-11 05:34:41.38964053 +0000 UTC m=+1183.806987228" observedRunningTime="2025-12-11 05:34:42.413079268 +0000 UTC m=+1184.830425966" watchObservedRunningTime="2025-12-11 05:34:42.416543991 +0000 UTC m=+1184.833890689" Dec 11 05:34:42 crc kubenswrapper[4628]: I1211 05:34:42.437366 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bddbc62d-5dcc-4ad3-9e74-adf443315395","Type":"ContainerStarted","Data":"520f93202cb4c3be86306ac0ef9a3f0ec65b385db5c541efe42cc93aa795bde3"} Dec 11 05:34:42 crc kubenswrapper[4628]: I1211 05:34:42.437514 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="bddbc62d-5dcc-4ad3-9e74-adf443315395" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://520f93202cb4c3be86306ac0ef9a3f0ec65b385db5c541efe42cc93aa795bde3" gracePeriod=30 Dec 11 05:34:42 crc kubenswrapper[4628]: I1211 05:34:42.470717 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.47298274 podStartE2EDuration="7.470695345s" podCreationTimestamp="2025-12-11 05:34:35 +0000 UTC" firstStartedPulling="2025-12-11 05:34:36.396259512 +0000 UTC m=+1178.813606210" lastFinishedPulling="2025-12-11 05:34:41.393972127 +0000 UTC m=+1183.811318815" observedRunningTime="2025-12-11 05:34:42.438758518 +0000 UTC m=+1184.856105216" watchObservedRunningTime="2025-12-11 05:34:42.470695345 +0000 UTC m=+1184.888042043" Dec 11 05:34:42 crc kubenswrapper[4628]: I1211 05:34:42.474679 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.086701061 
podStartE2EDuration="7.474660023s" podCreationTimestamp="2025-12-11 05:34:35 +0000 UTC" firstStartedPulling="2025-12-11 05:34:37.006512108 +0000 UTC m=+1179.423858806" lastFinishedPulling="2025-12-11 05:34:41.39447107 +0000 UTC m=+1183.811817768" observedRunningTime="2025-12-11 05:34:42.458328054 +0000 UTC m=+1184.875674742" watchObservedRunningTime="2025-12-11 05:34:42.474660023 +0000 UTC m=+1184.892006721" Dec 11 05:34:42 crc kubenswrapper[4628]: I1211 05:34:42.487988 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.763084714 podStartE2EDuration="7.48797075s" podCreationTimestamp="2025-12-11 05:34:35 +0000 UTC" firstStartedPulling="2025-12-11 05:34:36.66604496 +0000 UTC m=+1179.083391658" lastFinishedPulling="2025-12-11 05:34:41.390931006 +0000 UTC m=+1183.808277694" observedRunningTime="2025-12-11 05:34:42.477204301 +0000 UTC m=+1184.894550999" watchObservedRunningTime="2025-12-11 05:34:42.48797075 +0000 UTC m=+1184.905317448" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.360696 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.453616 4628 generic.go:334] "Generic (PLEG): container finished" podID="2a509ccd-0386-4db9-9648-ad9a5618b726" containerID="9609d3c5c4c2322b6e9b4331b88e4da56b9759be028f4df7bb20341c06c899d9" exitCode=0 Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.453885 4628 generic.go:334] "Generic (PLEG): container finished" podID="2a509ccd-0386-4db9-9648-ad9a5618b726" containerID="4445ffdec4649620dcfecd986c7878de95a14354f7fb0c68e5491d1728a1788e" exitCode=143 Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.454300 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a509ccd-0386-4db9-9648-ad9a5618b726","Type":"ContainerDied","Data":"9609d3c5c4c2322b6e9b4331b88e4da56b9759be028f4df7bb20341c06c899d9"} Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.454341 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a509ccd-0386-4db9-9648-ad9a5618b726","Type":"ContainerDied","Data":"4445ffdec4649620dcfecd986c7878de95a14354f7fb0c68e5491d1728a1788e"} Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.454352 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2a509ccd-0386-4db9-9648-ad9a5618b726","Type":"ContainerDied","Data":"e8fa5c04d9bb0db64bbf31f0e1238c47a545469b79bc784f70990f37a15c558e"} Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.454366 4628 scope.go:117] "RemoveContainer" containerID="9609d3c5c4c2322b6e9b4331b88e4da56b9759be028f4df7bb20341c06c899d9" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.454482 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.476622 4628 scope.go:117] "RemoveContainer" containerID="4445ffdec4649620dcfecd986c7878de95a14354f7fb0c68e5491d1728a1788e" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.493764 4628 scope.go:117] "RemoveContainer" containerID="9609d3c5c4c2322b6e9b4331b88e4da56b9759be028f4df7bb20341c06c899d9" Dec 11 05:34:43 crc kubenswrapper[4628]: E1211 05:34:43.494997 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9609d3c5c4c2322b6e9b4331b88e4da56b9759be028f4df7bb20341c06c899d9\": container with ID starting with 9609d3c5c4c2322b6e9b4331b88e4da56b9759be028f4df7bb20341c06c899d9 not found: ID does not exist" containerID="9609d3c5c4c2322b6e9b4331b88e4da56b9759be028f4df7bb20341c06c899d9" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.495033 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9609d3c5c4c2322b6e9b4331b88e4da56b9759be028f4df7bb20341c06c899d9"} err="failed to get container status \"9609d3c5c4c2322b6e9b4331b88e4da56b9759be028f4df7bb20341c06c899d9\": rpc error: code = NotFound desc = could not find container \"9609d3c5c4c2322b6e9b4331b88e4da56b9759be028f4df7bb20341c06c899d9\": container with ID starting with 9609d3c5c4c2322b6e9b4331b88e4da56b9759be028f4df7bb20341c06c899d9 not found: ID does not exist" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.495056 4628 scope.go:117] "RemoveContainer" containerID="4445ffdec4649620dcfecd986c7878de95a14354f7fb0c68e5491d1728a1788e" Dec 11 05:34:43 crc kubenswrapper[4628]: E1211 05:34:43.495401 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4445ffdec4649620dcfecd986c7878de95a14354f7fb0c68e5491d1728a1788e\": container with ID starting with 4445ffdec4649620dcfecd986c7878de95a14354f7fb0c68e5491d1728a1788e not found: ID does not exist" containerID="4445ffdec4649620dcfecd986c7878de95a14354f7fb0c68e5491d1728a1788e" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.495425 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4445ffdec4649620dcfecd986c7878de95a14354f7fb0c68e5491d1728a1788e"} err="failed to get container status \"4445ffdec4649620dcfecd986c7878de95a14354f7fb0c68e5491d1728a1788e\": rpc error: code = NotFound desc = could not find container \"4445ffdec4649620dcfecd986c7878de95a14354f7fb0c68e5491d1728a1788e\": container with ID starting with 4445ffdec4649620dcfecd986c7878de95a14354f7fb0c68e5491d1728a1788e not found: ID does not exist" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.495441 4628 scope.go:117] "RemoveContainer" containerID="9609d3c5c4c2322b6e9b4331b88e4da56b9759be028f4df7bb20341c06c899d9" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.495708 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9609d3c5c4c2322b6e9b4331b88e4da56b9759be028f4df7bb20341c06c899d9"} err="failed to get container status \"9609d3c5c4c2322b6e9b4331b88e4da56b9759be028f4df7bb20341c06c899d9\": rpc error: code = NotFound desc = could not find container \"9609d3c5c4c2322b6e9b4331b88e4da56b9759be028f4df7bb20341c06c899d9\": container with ID starting with 9609d3c5c4c2322b6e9b4331b88e4da56b9759be028f4df7bb20341c06c899d9 not found: ID does not exist" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.495728 4628 
scope.go:117] "RemoveContainer" containerID="4445ffdec4649620dcfecd986c7878de95a14354f7fb0c68e5491d1728a1788e" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.496015 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4445ffdec4649620dcfecd986c7878de95a14354f7fb0c68e5491d1728a1788e"} err="failed to get container status \"4445ffdec4649620dcfecd986c7878de95a14354f7fb0c68e5491d1728a1788e\": rpc error: code = NotFound desc = could not find container \"4445ffdec4649620dcfecd986c7878de95a14354f7fb0c68e5491d1728a1788e\": container with ID starting with 4445ffdec4649620dcfecd986c7878de95a14354f7fb0c68e5491d1728a1788e not found: ID does not exist" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.550837 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a509ccd-0386-4db9-9648-ad9a5618b726-combined-ca-bundle\") pod \"2a509ccd-0386-4db9-9648-ad9a5618b726\" (UID: \"2a509ccd-0386-4db9-9648-ad9a5618b726\") " Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.550932 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v72g7\" (UniqueName: \"kubernetes.io/projected/2a509ccd-0386-4db9-9648-ad9a5618b726-kube-api-access-v72g7\") pod \"2a509ccd-0386-4db9-9648-ad9a5618b726\" (UID: \"2a509ccd-0386-4db9-9648-ad9a5618b726\") " Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.551007 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a509ccd-0386-4db9-9648-ad9a5618b726-logs\") pod \"2a509ccd-0386-4db9-9648-ad9a5618b726\" (UID: \"2a509ccd-0386-4db9-9648-ad9a5618b726\") " Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.551460 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a509ccd-0386-4db9-9648-ad9a5618b726-logs" (OuterVolumeSpecName: "logs") pod "2a509ccd-0386-4db9-9648-ad9a5618b726" (UID: "2a509ccd-0386-4db9-9648-ad9a5618b726"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.551757 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a509ccd-0386-4db9-9648-ad9a5618b726-config-data\") pod \"2a509ccd-0386-4db9-9648-ad9a5618b726\" (UID: \"2a509ccd-0386-4db9-9648-ad9a5618b726\") " Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.552231 4628 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a509ccd-0386-4db9-9648-ad9a5618b726-logs\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.572867 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a509ccd-0386-4db9-9648-ad9a5618b726-kube-api-access-v72g7" (OuterVolumeSpecName: "kube-api-access-v72g7") pod "2a509ccd-0386-4db9-9648-ad9a5618b726" (UID: "2a509ccd-0386-4db9-9648-ad9a5618b726"). InnerVolumeSpecName "kube-api-access-v72g7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.581924 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a509ccd-0386-4db9-9648-ad9a5618b726-config-data" (OuterVolumeSpecName: "config-data") pod "2a509ccd-0386-4db9-9648-ad9a5618b726" (UID: "2a509ccd-0386-4db9-9648-ad9a5618b726"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.586808 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a509ccd-0386-4db9-9648-ad9a5618b726-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a509ccd-0386-4db9-9648-ad9a5618b726" (UID: "2a509ccd-0386-4db9-9648-ad9a5618b726"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.654047 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a509ccd-0386-4db9-9648-ad9a5618b726-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.654271 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a509ccd-0386-4db9-9648-ad9a5618b726-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.654350 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v72g7\" (UniqueName: \"kubernetes.io/projected/2a509ccd-0386-4db9-9648-ad9a5618b726-kube-api-access-v72g7\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.783197 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.794591 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.810679 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:34:43 crc kubenswrapper[4628]: E1211 05:34:43.811119 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a509ccd-0386-4db9-9648-ad9a5618b726" containerName="nova-metadata-log" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.811136 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a509ccd-0386-4db9-9648-ad9a5618b726" containerName="nova-metadata-log" Dec 11 05:34:43 crc kubenswrapper[4628]: E1211 05:34:43.811176 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a509ccd-0386-4db9-9648-ad9a5618b726" containerName="nova-metadata-metadata" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.811183 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a509ccd-0386-4db9-9648-ad9a5618b726" containerName="nova-metadata-metadata" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.811352 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a509ccd-0386-4db9-9648-ad9a5618b726" containerName="nova-metadata-log" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.811375 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a509ccd-0386-4db9-9648-ad9a5618b726" containerName="nova-metadata-metadata" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.812295 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.817014 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.817791 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.832304 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.902520 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a509ccd-0386-4db9-9648-ad9a5618b726" path="/var/lib/kubelet/pods/2a509ccd-0386-4db9-9648-ad9a5618b726/volumes" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.959585 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de05727e-cb6b-4794-a87a-666d09d1d786-config-data\") pod \"nova-metadata-0\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " pod="openstack/nova-metadata-0" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.959829 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de05727e-cb6b-4794-a87a-666d09d1d786-logs\") pod \"nova-metadata-0\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " pod="openstack/nova-metadata-0" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.959952 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/de05727e-cb6b-4794-a87a-666d09d1d786-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " pod="openstack/nova-metadata-0" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.960075 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de05727e-cb6b-4794-a87a-666d09d1d786-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " pod="openstack/nova-metadata-0" Dec 11 05:34:43 crc kubenswrapper[4628]: I1211 05:34:43.960119 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whv7x\" (UniqueName: \"kubernetes.io/projected/de05727e-cb6b-4794-a87a-666d09d1d786-kube-api-access-whv7x\") pod \"nova-metadata-0\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " pod="openstack/nova-metadata-0" Dec 11 05:34:44 crc kubenswrapper[4628]: I1211 05:34:44.061756 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de05727e-cb6b-4794-a87a-666d09d1d786-config-data\") pod \"nova-metadata-0\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " pod="openstack/nova-metadata-0" Dec 11 05:34:44 crc kubenswrapper[4628]: I1211 05:34:44.061835 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de05727e-cb6b-4794-a87a-666d09d1d786-logs\") pod \"nova-metadata-0\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " pod="openstack/nova-metadata-0" Dec 11 05:34:44 crc kubenswrapper[4628]: I1211 05:34:44.061880 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/de05727e-cb6b-4794-a87a-666d09d1d786-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " pod="openstack/nova-metadata-0" Dec 11 05:34:44 crc kubenswrapper[4628]: I1211 05:34:44.061920 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de05727e-cb6b-4794-a87a-666d09d1d786-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " pod="openstack/nova-metadata-0" Dec 11 05:34:44 crc kubenswrapper[4628]: I1211 05:34:44.061950 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whv7x\" (UniqueName: \"kubernetes.io/projected/de05727e-cb6b-4794-a87a-666d09d1d786-kube-api-access-whv7x\") pod \"nova-metadata-0\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " pod="openstack/nova-metadata-0" Dec 11 05:34:44 crc kubenswrapper[4628]: I1211 05:34:44.062721 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de05727e-cb6b-4794-a87a-666d09d1d786-logs\") pod \"nova-metadata-0\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " pod="openstack/nova-metadata-0" Dec 11 05:34:44 crc kubenswrapper[4628]: I1211 05:34:44.067787 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de05727e-cb6b-4794-a87a-666d09d1d786-config-data\") pod \"nova-metadata-0\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " pod="openstack/nova-metadata-0" Dec 11 05:34:44 crc kubenswrapper[4628]: I1211 05:34:44.069371 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de05727e-cb6b-4794-a87a-666d09d1d786-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " pod="openstack/nova-metadata-0" Dec 11 05:34:44 crc kubenswrapper[4628]: I1211 05:34:44.078518 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/de05727e-cb6b-4794-a87a-666d09d1d786-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " pod="openstack/nova-metadata-0" Dec 11 05:34:44 crc kubenswrapper[4628]: I1211 05:34:44.079726 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whv7x\" (UniqueName: \"kubernetes.io/projected/de05727e-cb6b-4794-a87a-666d09d1d786-kube-api-access-whv7x\") pod \"nova-metadata-0\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " pod="openstack/nova-metadata-0" Dec 11 05:34:44 crc kubenswrapper[4628]: I1211 05:34:44.130238 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 05:34:44 crc kubenswrapper[4628]: I1211 05:34:44.593977 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:34:45 crc kubenswrapper[4628]: I1211 05:34:45.489242 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"de05727e-cb6b-4794-a87a-666d09d1d786","Type":"ContainerStarted","Data":"8a0c940a224116363f8ac19c65e108352aee09634aef430b38379bb8b04ae1cf"} Dec 11 05:34:45 crc kubenswrapper[4628]: I1211 05:34:45.489790 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"de05727e-cb6b-4794-a87a-666d09d1d786","Type":"ContainerStarted","Data":"e7f6a880d07e2d0c951cf95e6e9b03b013c7dfb216a35fda53b02e6af1bcac1e"} Dec 11 05:34:45 crc kubenswrapper[4628]: I1211 05:34:45.489946 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"de05727e-cb6b-4794-a87a-666d09d1d786","Type":"ContainerStarted","Data":"959f0f728d9062960663dabfe6abc4e91668f033e133a7ac085d439cfd58c3af"} Dec 11 05:34:45 crc kubenswrapper[4628]: I1211 05:34:45.693709 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 11 05:34:45 crc kubenswrapper[4628]: I1211 05:34:45.693749 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 11 05:34:45 crc kubenswrapper[4628]: I1211 05:34:45.947597 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.137072 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.160178 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.160156142 podStartE2EDuration="3.160156142s" podCreationTimestamp="2025-12-11 05:34:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:34:45.512536042 +0000 UTC m=+1187.929882780" watchObservedRunningTime="2025-12-11 05:34:46.160156142 +0000 UTC m=+1188.577502850" Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.223971 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-bqcv5"] Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.224559 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" podUID="46f03c63-6732-4757-8685-0742a4c25590" containerName="dnsmasq-dns" containerID="cri-o://592581898fa37bdfd613617f3d99d3153ec464d171002c628d5d24835a4b33e5" gracePeriod=10 Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.240057 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.240101 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.277395 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.509522 4628 generic.go:334] "Generic (PLEG): container finished" podID="46f03c63-6732-4757-8685-0742a4c25590" 
containerID="592581898fa37bdfd613617f3d99d3153ec464d171002c628d5d24835a4b33e5" exitCode=0 Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.509586 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" event={"ID":"46f03c63-6732-4757-8685-0742a4c25590","Type":"ContainerDied","Data":"592581898fa37bdfd613617f3d99d3153ec464d171002c628d5d24835a4b33e5"} Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.511351 4628 generic.go:334] "Generic (PLEG): container finished" podID="4df8deca-75c5-40b2-a666-4b1c6050c273" containerID="e0c7e6e0b9f64661b8ee1a96199acb85c920ee37827d52abac1e94d14daf375a" exitCode=0 Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.511434 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nsgqf" event={"ID":"4df8deca-75c5-40b2-a666-4b1c6050c273","Type":"ContainerDied","Data":"e0c7e6e0b9f64661b8ee1a96199acb85c920ee37827d52abac1e94d14daf375a"} Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.513780 4628 generic.go:334] "Generic (PLEG): container finished" podID="d090d098-9d30-4ee0-89e6-a408f1340325" containerID="32140fba5d8f55d75c4386c2a272a4df60e1a31b54f45ba902474c0205177e5e" exitCode=0 Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.514519 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7bm5v" event={"ID":"d090d098-9d30-4ee0-89e6-a408f1340325","Type":"ContainerDied","Data":"32140fba5d8f55d75c4386c2a272a4df60e1a31b54f45ba902474c0205177e5e"} Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.570043 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.775775 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.777360 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="513edbab-8ccd-48ab-86e4-103e0dd3fc9e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.777585 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="513edbab-8ccd-48ab-86e4-103e0dd3fc9e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.909729 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-config\") pod \"46f03c63-6732-4757-8685-0742a4c25590\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.910159 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkfzj\" (UniqueName: \"kubernetes.io/projected/46f03c63-6732-4757-8685-0742a4c25590-kube-api-access-mkfzj\") pod \"46f03c63-6732-4757-8685-0742a4c25590\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.910239 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-dns-svc\") pod \"46f03c63-6732-4757-8685-0742a4c25590\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.910313 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-ovsdbserver-nb\") pod \"46f03c63-6732-4757-8685-0742a4c25590\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.910371 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-dns-swift-storage-0\") pod \"46f03c63-6732-4757-8685-0742a4c25590\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.910428 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-ovsdbserver-sb\") pod \"46f03c63-6732-4757-8685-0742a4c25590\" (UID: \"46f03c63-6732-4757-8685-0742a4c25590\") " Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.918994 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46f03c63-6732-4757-8685-0742a4c25590-kube-api-access-mkfzj" (OuterVolumeSpecName: "kube-api-access-mkfzj") pod "46f03c63-6732-4757-8685-0742a4c25590" (UID: "46f03c63-6732-4757-8685-0742a4c25590"). InnerVolumeSpecName "kube-api-access-mkfzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.970116 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "46f03c63-6732-4757-8685-0742a4c25590" (UID: "46f03c63-6732-4757-8685-0742a4c25590"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.973864 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "46f03c63-6732-4757-8685-0742a4c25590" (UID: "46f03c63-6732-4757-8685-0742a4c25590"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:34:46 crc kubenswrapper[4628]: I1211 05:34:46.995216 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "46f03c63-6732-4757-8685-0742a4c25590" (UID: "46f03c63-6732-4757-8685-0742a4c25590"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:34:47 crc kubenswrapper[4628]: I1211 05:34:47.003071 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-config" (OuterVolumeSpecName: "config") pod "46f03c63-6732-4757-8685-0742a4c25590" (UID: "46f03c63-6732-4757-8685-0742a4c25590"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:34:47 crc kubenswrapper[4628]: I1211 05:34:47.020529 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkfzj\" (UniqueName: \"kubernetes.io/projected/46f03c63-6732-4757-8685-0742a4c25590-kube-api-access-mkfzj\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:47 crc kubenswrapper[4628]: I1211 05:34:47.020558 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:47 crc kubenswrapper[4628]: I1211 05:34:47.020577 4628 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:47 crc kubenswrapper[4628]: I1211 05:34:47.020586 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:47 crc kubenswrapper[4628]: I1211 05:34:47.020597 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:47 crc kubenswrapper[4628]: I1211 05:34:47.042701 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "46f03c63-6732-4757-8685-0742a4c25590" (UID: "46f03c63-6732-4757-8685-0742a4c25590"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:34:47 crc kubenswrapper[4628]: I1211 05:34:47.121698 4628 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46f03c63-6732-4757-8685-0742a4c25590-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:47 crc kubenswrapper[4628]: I1211 05:34:47.531153 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" event={"ID":"46f03c63-6732-4757-8685-0742a4c25590","Type":"ContainerDied","Data":"3bfc7244561f6336396b09f287fb1a31fb76ec873a0d7c2b9647046761e0f002"} Dec 11 05:34:47 crc kubenswrapper[4628]: I1211 05:34:47.531279 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-bqcv5" Dec 11 05:34:47 crc kubenswrapper[4628]: I1211 05:34:47.531382 4628 scope.go:117] "RemoveContainer" containerID="592581898fa37bdfd613617f3d99d3153ec464d171002c628d5d24835a4b33e5" Dec 11 05:34:47 crc kubenswrapper[4628]: I1211 05:34:47.575099 4628 scope.go:117] "RemoveContainer" containerID="25a17ebd1255ab860e6b40b219202f123e660645ac711bac9310f5fc4d069f32" Dec 11 05:34:47 crc kubenswrapper[4628]: I1211 05:34:47.575253 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-bqcv5"] Dec 11 05:34:47 crc kubenswrapper[4628]: I1211 05:34:47.586527 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-bqcv5"] Dec 11 05:34:47 crc kubenswrapper[4628]: I1211 05:34:47.940672 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46f03c63-6732-4757-8685-0742a4c25590" path="/var/lib/kubelet/pods/46f03c63-6732-4757-8685-0742a4c25590/volumes" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.069475 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7bm5v" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.077630 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nsgqf" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.255440 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d090d098-9d30-4ee0-89e6-a408f1340325-combined-ca-bundle\") pod \"d090d098-9d30-4ee0-89e6-a408f1340325\" (UID: \"d090d098-9d30-4ee0-89e6-a408f1340325\") " Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.255512 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d090d098-9d30-4ee0-89e6-a408f1340325-config-data\") pod \"d090d098-9d30-4ee0-89e6-a408f1340325\" (UID: \"d090d098-9d30-4ee0-89e6-a408f1340325\") " Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.255545 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c94s\" (UniqueName: \"kubernetes.io/projected/4df8deca-75c5-40b2-a666-4b1c6050c273-kube-api-access-6c94s\") pod \"4df8deca-75c5-40b2-a666-4b1c6050c273\" (UID: \"4df8deca-75c5-40b2-a666-4b1c6050c273\") " Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.255597 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4df8deca-75c5-40b2-a666-4b1c6050c273-combined-ca-bundle\") pod \"4df8deca-75c5-40b2-a666-4b1c6050c273\" (UID: \"4df8deca-75c5-40b2-a666-4b1c6050c273\") " Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.255634 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skgzv\" (UniqueName: \"kubernetes.io/projected/d090d098-9d30-4ee0-89e6-a408f1340325-kube-api-access-skgzv\") pod \"d090d098-9d30-4ee0-89e6-a408f1340325\" (UID: \"d090d098-9d30-4ee0-89e6-a408f1340325\") " Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.255728 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4df8deca-75c5-40b2-a666-4b1c6050c273-scripts\") pod \"4df8deca-75c5-40b2-a666-4b1c6050c273\" (UID: \"4df8deca-75c5-40b2-a666-4b1c6050c273\") " Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.255764 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4df8deca-75c5-40b2-a666-4b1c6050c273-config-data\") pod \"4df8deca-75c5-40b2-a666-4b1c6050c273\" (UID: \"4df8deca-75c5-40b2-a666-4b1c6050c273\") " Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.255809 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d090d098-9d30-4ee0-89e6-a408f1340325-scripts\") pod \"d090d098-9d30-4ee0-89e6-a408f1340325\" (UID: \"d090d098-9d30-4ee0-89e6-a408f1340325\") " Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.261719 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4df8deca-75c5-40b2-a666-4b1c6050c273-kube-api-access-6c94s" (OuterVolumeSpecName: "kube-api-access-6c94s") pod "4df8deca-75c5-40b2-a666-4b1c6050c273" (UID: "4df8deca-75c5-40b2-a666-4b1c6050c273"). InnerVolumeSpecName "kube-api-access-6c94s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.264684 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d090d098-9d30-4ee0-89e6-a408f1340325-kube-api-access-skgzv" (OuterVolumeSpecName: "kube-api-access-skgzv") pod "d090d098-9d30-4ee0-89e6-a408f1340325" (UID: "d090d098-9d30-4ee0-89e6-a408f1340325"). InnerVolumeSpecName "kube-api-access-skgzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.264974 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4df8deca-75c5-40b2-a666-4b1c6050c273-scripts" (OuterVolumeSpecName: "scripts") pod "4df8deca-75c5-40b2-a666-4b1c6050c273" (UID: "4df8deca-75c5-40b2-a666-4b1c6050c273"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.279066 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d090d098-9d30-4ee0-89e6-a408f1340325-scripts" (OuterVolumeSpecName: "scripts") pod "d090d098-9d30-4ee0-89e6-a408f1340325" (UID: "d090d098-9d30-4ee0-89e6-a408f1340325"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.300527 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d090d098-9d30-4ee0-89e6-a408f1340325-config-data" (OuterVolumeSpecName: "config-data") pod "d090d098-9d30-4ee0-89e6-a408f1340325" (UID: "d090d098-9d30-4ee0-89e6-a408f1340325"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.306723 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4df8deca-75c5-40b2-a666-4b1c6050c273-config-data" (OuterVolumeSpecName: "config-data") pod "4df8deca-75c5-40b2-a666-4b1c6050c273" (UID: "4df8deca-75c5-40b2-a666-4b1c6050c273"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.307959 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4df8deca-75c5-40b2-a666-4b1c6050c273-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4df8deca-75c5-40b2-a666-4b1c6050c273" (UID: "4df8deca-75c5-40b2-a666-4b1c6050c273"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.320113 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d090d098-9d30-4ee0-89e6-a408f1340325-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d090d098-9d30-4ee0-89e6-a408f1340325" (UID: "d090d098-9d30-4ee0-89e6-a408f1340325"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.358473 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d090d098-9d30-4ee0-89e6-a408f1340325-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.358513 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d090d098-9d30-4ee0-89e6-a408f1340325-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.358556 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c94s\" (UniqueName: \"kubernetes.io/projected/4df8deca-75c5-40b2-a666-4b1c6050c273-kube-api-access-6c94s\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.358572 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4df8deca-75c5-40b2-a666-4b1c6050c273-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.358585 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skgzv\" (UniqueName: \"kubernetes.io/projected/d090d098-9d30-4ee0-89e6-a408f1340325-kube-api-access-skgzv\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.358595 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4df8deca-75c5-40b2-a666-4b1c6050c273-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.358606 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4df8deca-75c5-40b2-a666-4b1c6050c273-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.358617 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d090d098-9d30-4ee0-89e6-a408f1340325-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.544202 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7bm5v" event={"ID":"d090d098-9d30-4ee0-89e6-a408f1340325","Type":"ContainerDied","Data":"3b2f754cf197e6fc1ff32abe15bb48214099520cb9859194f72a1cbf876dc2e5"} Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.544237 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b2f754cf197e6fc1ff32abe15bb48214099520cb9859194f72a1cbf876dc2e5" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.544292 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7bm5v" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.589147 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nsgqf" event={"ID":"4df8deca-75c5-40b2-a666-4b1c6050c273","Type":"ContainerDied","Data":"a440815945b3800abc9b892b20207b075ba44b44f8e0f0c4be09ef266ff42391"} Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.589192 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a440815945b3800abc9b892b20207b075ba44b44f8e0f0c4be09ef266ff42391" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.589334 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nsgqf" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.648592 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 11 05:34:48 crc kubenswrapper[4628]: E1211 05:34:48.649012 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f03c63-6732-4757-8685-0742a4c25590" containerName="init" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.649028 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f03c63-6732-4757-8685-0742a4c25590" containerName="init" Dec 11 05:34:48 crc kubenswrapper[4628]: E1211 05:34:48.649044 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d090d098-9d30-4ee0-89e6-a408f1340325" containerName="nova-manage" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.649050 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="d090d098-9d30-4ee0-89e6-a408f1340325" containerName="nova-manage" Dec 11 05:34:48 crc kubenswrapper[4628]: E1211 05:34:48.649058 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f03c63-6732-4757-8685-0742a4c25590" containerName="dnsmasq-dns" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.649064 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f03c63-6732-4757-8685-0742a4c25590" containerName="dnsmasq-dns" Dec 11 05:34:48 crc kubenswrapper[4628]: E1211 05:34:48.649090 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4df8deca-75c5-40b2-a666-4b1c6050c273" containerName="nova-cell1-conductor-db-sync" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.649096 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="4df8deca-75c5-40b2-a666-4b1c6050c273" containerName="nova-cell1-conductor-db-sync" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.649324 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="4df8deca-75c5-40b2-a666-4b1c6050c273" containerName="nova-cell1-conductor-db-sync" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.649349 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="d090d098-9d30-4ee0-89e6-a408f1340325" containerName="nova-manage" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.649362 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f03c63-6732-4757-8685-0742a4c25590" containerName="dnsmasq-dns" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.650051 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.656023 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.663338 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.738650 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.744100 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="513edbab-8ccd-48ab-86e4-103e0dd3fc9e" containerName="nova-api-log" containerID="cri-o://100b19f66f91eb948a669a28deaf9f10ac734b8488c81c1d62870220dbcbd2fc" gracePeriod=30 Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.744506 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="513edbab-8ccd-48ab-86e4-103e0dd3fc9e" containerName="nova-api-api" containerID="cri-o://b3727ba33efdfdca1f868d1099eb0f486e3a1854d9c451d990454747c73cddca" gracePeriod=30 Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.763210 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.763472 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e4653a43-d598-4252-9f50-f2d14521d44c" containerName="nova-scheduler-scheduler" containerID="cri-o://f3997606b95dd7dacd1288015caf1442a40850ca9e905c9f80aa8d6b8166a48d" gracePeriod=30 Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.768320 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ebe2ed-1819-4bd2-9f26-e8f392645684-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"48ebe2ed-1819-4bd2-9f26-e8f392645684\") " pod="openstack/nova-cell1-conductor-0" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.768431 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ebe2ed-1819-4bd2-9f26-e8f392645684-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"48ebe2ed-1819-4bd2-9f26-e8f392645684\") " pod="openstack/nova-cell1-conductor-0" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.768464 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22lwv\" (UniqueName: \"kubernetes.io/projected/48ebe2ed-1819-4bd2-9f26-e8f392645684-kube-api-access-22lwv\") pod \"nova-cell1-conductor-0\" (UID: \"48ebe2ed-1819-4bd2-9f26-e8f392645684\") " pod="openstack/nova-cell1-conductor-0" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.790275 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.790490 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="de05727e-cb6b-4794-a87a-666d09d1d786" containerName="nova-metadata-log" containerID="cri-o://e7f6a880d07e2d0c951cf95e6e9b03b013c7dfb216a35fda53b02e6af1bcac1e" gracePeriod=30 Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.790595 4628 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/nova-metadata-0" podUID="de05727e-cb6b-4794-a87a-666d09d1d786" containerName="nova-metadata-metadata" containerID="cri-o://8a0c940a224116363f8ac19c65e108352aee09634aef430b38379bb8b04ae1cf" gracePeriod=30 Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.869893 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ebe2ed-1819-4bd2-9f26-e8f392645684-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"48ebe2ed-1819-4bd2-9f26-e8f392645684\") " pod="openstack/nova-cell1-conductor-0" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.870044 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ebe2ed-1819-4bd2-9f26-e8f392645684-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"48ebe2ed-1819-4bd2-9f26-e8f392645684\") " pod="openstack/nova-cell1-conductor-0" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.870081 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22lwv\" (UniqueName: \"kubernetes.io/projected/48ebe2ed-1819-4bd2-9f26-e8f392645684-kube-api-access-22lwv\") pod \"nova-cell1-conductor-0\" (UID: \"48ebe2ed-1819-4bd2-9f26-e8f392645684\") " pod="openstack/nova-cell1-conductor-0" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.874106 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ebe2ed-1819-4bd2-9f26-e8f392645684-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"48ebe2ed-1819-4bd2-9f26-e8f392645684\") " pod="openstack/nova-cell1-conductor-0" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.874642 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ebe2ed-1819-4bd2-9f26-e8f392645684-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"48ebe2ed-1819-4bd2-9f26-e8f392645684\") " pod="openstack/nova-cell1-conductor-0" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.891980 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22lwv\" (UniqueName: \"kubernetes.io/projected/48ebe2ed-1819-4bd2-9f26-e8f392645684-kube-api-access-22lwv\") pod \"nova-cell1-conductor-0\" (UID: \"48ebe2ed-1819-4bd2-9f26-e8f392645684\") " pod="openstack/nova-cell1-conductor-0" Dec 11 05:34:48 crc kubenswrapper[4628]: I1211 05:34:48.984022 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.130517 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.130573 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.306617 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.480398 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de05727e-cb6b-4794-a87a-666d09d1d786-config-data\") pod \"de05727e-cb6b-4794-a87a-666d09d1d786\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.480483 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/de05727e-cb6b-4794-a87a-666d09d1d786-nova-metadata-tls-certs\") pod \"de05727e-cb6b-4794-a87a-666d09d1d786\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.480545 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de05727e-cb6b-4794-a87a-666d09d1d786-logs\") pod \"de05727e-cb6b-4794-a87a-666d09d1d786\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.480638 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de05727e-cb6b-4794-a87a-666d09d1d786-combined-ca-bundle\") pod \"de05727e-cb6b-4794-a87a-666d09d1d786\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.480727 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whv7x\" (UniqueName: \"kubernetes.io/projected/de05727e-cb6b-4794-a87a-666d09d1d786-kube-api-access-whv7x\") pod \"de05727e-cb6b-4794-a87a-666d09d1d786\" (UID: \"de05727e-cb6b-4794-a87a-666d09d1d786\") " Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.481172 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de05727e-cb6b-4794-a87a-666d09d1d786-logs" (OuterVolumeSpecName: "logs") pod "de05727e-cb6b-4794-a87a-666d09d1d786" (UID: "de05727e-cb6b-4794-a87a-666d09d1d786"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.485464 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de05727e-cb6b-4794-a87a-666d09d1d786-kube-api-access-whv7x" (OuterVolumeSpecName: "kube-api-access-whv7x") pod "de05727e-cb6b-4794-a87a-666d09d1d786" (UID: "de05727e-cb6b-4794-a87a-666d09d1d786"). InnerVolumeSpecName "kube-api-access-whv7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.518392 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de05727e-cb6b-4794-a87a-666d09d1d786-config-data" (OuterVolumeSpecName: "config-data") pod "de05727e-cb6b-4794-a87a-666d09d1d786" (UID: "de05727e-cb6b-4794-a87a-666d09d1d786"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.518864 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de05727e-cb6b-4794-a87a-666d09d1d786-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de05727e-cb6b-4794-a87a-666d09d1d786" (UID: "de05727e-cb6b-4794-a87a-666d09d1d786"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.540365 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 11 05:34:49 crc kubenswrapper[4628]: W1211 05:34:49.547135 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48ebe2ed_1819_4bd2_9f26_e8f392645684.slice/crio-6e8d8d8a153f0cde4d988b7d45b4f9819c66745a945cc5b89e8588c6de45aef7 WatchSource:0}: Error finding container 6e8d8d8a153f0cde4d988b7d45b4f9819c66745a945cc5b89e8588c6de45aef7: Status 404 returned error can't find the container with id 6e8d8d8a153f0cde4d988b7d45b4f9819c66745a945cc5b89e8588c6de45aef7 Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.555656 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de05727e-cb6b-4794-a87a-666d09d1d786-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "de05727e-cb6b-4794-a87a-666d09d1d786" (UID: "de05727e-cb6b-4794-a87a-666d09d1d786"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.583052 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whv7x\" (UniqueName: \"kubernetes.io/projected/de05727e-cb6b-4794-a87a-666d09d1d786-kube-api-access-whv7x\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.583281 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de05727e-cb6b-4794-a87a-666d09d1d786-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.583386 4628 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/de05727e-cb6b-4794-a87a-666d09d1d786-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.583461 4628 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de05727e-cb6b-4794-a87a-666d09d1d786-logs\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.583534 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de05727e-cb6b-4794-a87a-666d09d1d786-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.600534 4628 generic.go:334] "Generic (PLEG): container finished" podID="513edbab-8ccd-48ab-86e4-103e0dd3fc9e" containerID="100b19f66f91eb948a669a28deaf9f10ac734b8488c81c1d62870220dbcbd2fc" exitCode=143 Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.600607 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"513edbab-8ccd-48ab-86e4-103e0dd3fc9e","Type":"ContainerDied","Data":"100b19f66f91eb948a669a28deaf9f10ac734b8488c81c1d62870220dbcbd2fc"} Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.602605 4628 generic.go:334] "Generic (PLEG): container finished" podID="de05727e-cb6b-4794-a87a-666d09d1d786" containerID="8a0c940a224116363f8ac19c65e108352aee09634aef430b38379bb8b04ae1cf" exitCode=0 Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.602638 4628 generic.go:334] "Generic (PLEG): container finished" podID="de05727e-cb6b-4794-a87a-666d09d1d786" 
containerID="e7f6a880d07e2d0c951cf95e6e9b03b013c7dfb216a35fda53b02e6af1bcac1e" exitCode=143 Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.602690 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"de05727e-cb6b-4794-a87a-666d09d1d786","Type":"ContainerDied","Data":"8a0c940a224116363f8ac19c65e108352aee09634aef430b38379bb8b04ae1cf"} Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.602718 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"de05727e-cb6b-4794-a87a-666d09d1d786","Type":"ContainerDied","Data":"e7f6a880d07e2d0c951cf95e6e9b03b013c7dfb216a35fda53b02e6af1bcac1e"} Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.602728 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"de05727e-cb6b-4794-a87a-666d09d1d786","Type":"ContainerDied","Data":"959f0f728d9062960663dabfe6abc4e91668f033e133a7ac085d439cfd58c3af"} Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.602744 4628 scope.go:117] "RemoveContainer" containerID="8a0c940a224116363f8ac19c65e108352aee09634aef430b38379bb8b04ae1cf" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.602889 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.608488 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"48ebe2ed-1819-4bd2-9f26-e8f392645684","Type":"ContainerStarted","Data":"6e8d8d8a153f0cde4d988b7d45b4f9819c66745a945cc5b89e8588c6de45aef7"} Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.653390 4628 scope.go:117] "RemoveContainer" containerID="e7f6a880d07e2d0c951cf95e6e9b03b013c7dfb216a35fda53b02e6af1bcac1e" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.680477 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.698867 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.721160 4628 scope.go:117] "RemoveContainer" containerID="8a0c940a224116363f8ac19c65e108352aee09634aef430b38379bb8b04ae1cf" Dec 11 05:34:49 crc kubenswrapper[4628]: E1211 05:34:49.743414 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a0c940a224116363f8ac19c65e108352aee09634aef430b38379bb8b04ae1cf\": container with ID starting with 8a0c940a224116363f8ac19c65e108352aee09634aef430b38379bb8b04ae1cf not found: ID does not exist" containerID="8a0c940a224116363f8ac19c65e108352aee09634aef430b38379bb8b04ae1cf" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.743472 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a0c940a224116363f8ac19c65e108352aee09634aef430b38379bb8b04ae1cf"} err="failed to get container status \"8a0c940a224116363f8ac19c65e108352aee09634aef430b38379bb8b04ae1cf\": rpc error: code = NotFound desc = could not find container \"8a0c940a224116363f8ac19c65e108352aee09634aef430b38379bb8b04ae1cf\": container with ID starting with 8a0c940a224116363f8ac19c65e108352aee09634aef430b38379bb8b04ae1cf not found: ID does not exist" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.743508 4628 scope.go:117] "RemoveContainer" containerID="e7f6a880d07e2d0c951cf95e6e9b03b013c7dfb216a35fda53b02e6af1bcac1e" Dec 
11 05:34:49 crc kubenswrapper[4628]: E1211 05:34:49.745532 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7f6a880d07e2d0c951cf95e6e9b03b013c7dfb216a35fda53b02e6af1bcac1e\": container with ID starting with e7f6a880d07e2d0c951cf95e6e9b03b013c7dfb216a35fda53b02e6af1bcac1e not found: ID does not exist" containerID="e7f6a880d07e2d0c951cf95e6e9b03b013c7dfb216a35fda53b02e6af1bcac1e" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.745561 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7f6a880d07e2d0c951cf95e6e9b03b013c7dfb216a35fda53b02e6af1bcac1e"} err="failed to get container status \"e7f6a880d07e2d0c951cf95e6e9b03b013c7dfb216a35fda53b02e6af1bcac1e\": rpc error: code = NotFound desc = could not find container \"e7f6a880d07e2d0c951cf95e6e9b03b013c7dfb216a35fda53b02e6af1bcac1e\": container with ID starting with e7f6a880d07e2d0c951cf95e6e9b03b013c7dfb216a35fda53b02e6af1bcac1e not found: ID does not exist" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.745578 4628 scope.go:117] "RemoveContainer" containerID="8a0c940a224116363f8ac19c65e108352aee09634aef430b38379bb8b04ae1cf" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.755372 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a0c940a224116363f8ac19c65e108352aee09634aef430b38379bb8b04ae1cf"} err="failed to get container status \"8a0c940a224116363f8ac19c65e108352aee09634aef430b38379bb8b04ae1cf\": rpc error: code = NotFound desc = could not find container \"8a0c940a224116363f8ac19c65e108352aee09634aef430b38379bb8b04ae1cf\": container with ID starting with 8a0c940a224116363f8ac19c65e108352aee09634aef430b38379bb8b04ae1cf not found: ID does not exist" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.755412 4628 scope.go:117] "RemoveContainer" containerID="e7f6a880d07e2d0c951cf95e6e9b03b013c7dfb216a35fda53b02e6af1bcac1e" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.756926 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7f6a880d07e2d0c951cf95e6e9b03b013c7dfb216a35fda53b02e6af1bcac1e"} err="failed to get container status \"e7f6a880d07e2d0c951cf95e6e9b03b013c7dfb216a35fda53b02e6af1bcac1e\": rpc error: code = NotFound desc = could not find container \"e7f6a880d07e2d0c951cf95e6e9b03b013c7dfb216a35fda53b02e6af1bcac1e\": container with ID starting with e7f6a880d07e2d0c951cf95e6e9b03b013c7dfb216a35fda53b02e6af1bcac1e not found: ID does not exist" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.775799 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:34:49 crc kubenswrapper[4628]: E1211 05:34:49.776581 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de05727e-cb6b-4794-a87a-666d09d1d786" containerName="nova-metadata-metadata" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.776595 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="de05727e-cb6b-4794-a87a-666d09d1d786" containerName="nova-metadata-metadata" Dec 11 05:34:49 crc kubenswrapper[4628]: E1211 05:34:49.776613 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de05727e-cb6b-4794-a87a-666d09d1d786" containerName="nova-metadata-log" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.776621 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="de05727e-cb6b-4794-a87a-666d09d1d786" 
containerName="nova-metadata-log" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.776984 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="de05727e-cb6b-4794-a87a-666d09d1d786" containerName="nova-metadata-metadata" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.777025 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="de05727e-cb6b-4794-a87a-666d09d1d786" containerName="nova-metadata-log" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.799957 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.799854 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.803379 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.803481 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.901396 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2stdq\" (UniqueName: \"kubernetes.io/projected/40242d08-6531-48a3-8df9-5ee8b069011d-kube-api-access-2stdq\") pod \"nova-metadata-0\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " pod="openstack/nova-metadata-0" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.901751 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40242d08-6531-48a3-8df9-5ee8b069011d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " pod="openstack/nova-metadata-0" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.901778 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40242d08-6531-48a3-8df9-5ee8b069011d-logs\") pod \"nova-metadata-0\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " pod="openstack/nova-metadata-0" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.901801 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/40242d08-6531-48a3-8df9-5ee8b069011d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " pod="openstack/nova-metadata-0" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.901837 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40242d08-6531-48a3-8df9-5ee8b069011d-config-data\") pod \"nova-metadata-0\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " pod="openstack/nova-metadata-0" Dec 11 05:34:49 crc kubenswrapper[4628]: I1211 05:34:49.910605 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de05727e-cb6b-4794-a87a-666d09d1d786" path="/var/lib/kubelet/pods/de05727e-cb6b-4794-a87a-666d09d1d786/volumes" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.003552 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40242d08-6531-48a3-8df9-5ee8b069011d-logs\") pod \"nova-metadata-0\" (UID: 
\"40242d08-6531-48a3-8df9-5ee8b069011d\") " pod="openstack/nova-metadata-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.003605 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/40242d08-6531-48a3-8df9-5ee8b069011d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " pod="openstack/nova-metadata-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.003661 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40242d08-6531-48a3-8df9-5ee8b069011d-config-data\") pod \"nova-metadata-0\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " pod="openstack/nova-metadata-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.003804 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2stdq\" (UniqueName: \"kubernetes.io/projected/40242d08-6531-48a3-8df9-5ee8b069011d-kube-api-access-2stdq\") pod \"nova-metadata-0\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " pod="openstack/nova-metadata-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.003896 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40242d08-6531-48a3-8df9-5ee8b069011d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " pod="openstack/nova-metadata-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.006068 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40242d08-6531-48a3-8df9-5ee8b069011d-logs\") pod \"nova-metadata-0\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " pod="openstack/nova-metadata-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.011085 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40242d08-6531-48a3-8df9-5ee8b069011d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " pod="openstack/nova-metadata-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.015559 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/40242d08-6531-48a3-8df9-5ee8b069011d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " pod="openstack/nova-metadata-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.022083 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2stdq\" (UniqueName: \"kubernetes.io/projected/40242d08-6531-48a3-8df9-5ee8b069011d-kube-api-access-2stdq\") pod \"nova-metadata-0\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " pod="openstack/nova-metadata-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.029761 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40242d08-6531-48a3-8df9-5ee8b069011d-config-data\") pod \"nova-metadata-0\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " pod="openstack/nova-metadata-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.138357 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.141694 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.340474 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4653a43-d598-4252-9f50-f2d14521d44c-config-data\") pod \"e4653a43-d598-4252-9f50-f2d14521d44c\" (UID: \"e4653a43-d598-4252-9f50-f2d14521d44c\") " Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.340907 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4653a43-d598-4252-9f50-f2d14521d44c-combined-ca-bundle\") pod \"e4653a43-d598-4252-9f50-f2d14521d44c\" (UID: \"e4653a43-d598-4252-9f50-f2d14521d44c\") " Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.340970 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnmdr\" (UniqueName: \"kubernetes.io/projected/e4653a43-d598-4252-9f50-f2d14521d44c-kube-api-access-tnmdr\") pod \"e4653a43-d598-4252-9f50-f2d14521d44c\" (UID: \"e4653a43-d598-4252-9f50-f2d14521d44c\") " Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.359308 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4653a43-d598-4252-9f50-f2d14521d44c-kube-api-access-tnmdr" (OuterVolumeSpecName: "kube-api-access-tnmdr") pod "e4653a43-d598-4252-9f50-f2d14521d44c" (UID: "e4653a43-d598-4252-9f50-f2d14521d44c"). InnerVolumeSpecName "kube-api-access-tnmdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.384430 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4653a43-d598-4252-9f50-f2d14521d44c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4653a43-d598-4252-9f50-f2d14521d44c" (UID: "e4653a43-d598-4252-9f50-f2d14521d44c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.427464 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4653a43-d598-4252-9f50-f2d14521d44c-config-data" (OuterVolumeSpecName: "config-data") pod "e4653a43-d598-4252-9f50-f2d14521d44c" (UID: "e4653a43-d598-4252-9f50-f2d14521d44c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.442887 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4653a43-d598-4252-9f50-f2d14521d44c-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.442915 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4653a43-d598-4252-9f50-f2d14521d44c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.442927 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnmdr\" (UniqueName: \"kubernetes.io/projected/e4653a43-d598-4252-9f50-f2d14521d44c-kube-api-access-tnmdr\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.623637 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"48ebe2ed-1819-4bd2-9f26-e8f392645684","Type":"ContainerStarted","Data":"4e9b67ce80b5af768c5ed24f748870d5077a02bb0b80543a6e123de77483bbd4"} Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.623957 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.625951 4628 generic.go:334] "Generic (PLEG): container finished" podID="e4653a43-d598-4252-9f50-f2d14521d44c" containerID="f3997606b95dd7dacd1288015caf1442a40850ca9e905c9f80aa8d6b8166a48d" exitCode=0 Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.625981 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e4653a43-d598-4252-9f50-f2d14521d44c","Type":"ContainerDied","Data":"f3997606b95dd7dacd1288015caf1442a40850ca9e905c9f80aa8d6b8166a48d"} Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.625998 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e4653a43-d598-4252-9f50-f2d14521d44c","Type":"ContainerDied","Data":"57fe8e9983c014234a4eb98fd44af7a6a02be5a7a02e1858a1365f644bc8f8b5"} Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.626016 4628 scope.go:117] "RemoveContainer" containerID="f3997606b95dd7dacd1288015caf1442a40850ca9e905c9f80aa8d6b8166a48d" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.626102 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.653417 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.653400283 podStartE2EDuration="2.653400283s" podCreationTimestamp="2025-12-11 05:34:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:34:50.6472998 +0000 UTC m=+1193.064646488" watchObservedRunningTime="2025-12-11 05:34:50.653400283 +0000 UTC m=+1193.070746981" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.671400 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.673621 4628 scope.go:117] "RemoveContainer" containerID="f3997606b95dd7dacd1288015caf1442a40850ca9e905c9f80aa8d6b8166a48d" Dec 11 05:34:50 crc kubenswrapper[4628]: E1211 05:34:50.675353 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3997606b95dd7dacd1288015caf1442a40850ca9e905c9f80aa8d6b8166a48d\": container with ID starting with f3997606b95dd7dacd1288015caf1442a40850ca9e905c9f80aa8d6b8166a48d not found: ID does not exist" containerID="f3997606b95dd7dacd1288015caf1442a40850ca9e905c9f80aa8d6b8166a48d" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.675403 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3997606b95dd7dacd1288015caf1442a40850ca9e905c9f80aa8d6b8166a48d"} err="failed to get container status \"f3997606b95dd7dacd1288015caf1442a40850ca9e905c9f80aa8d6b8166a48d\": rpc error: code = NotFound desc = could not find container \"f3997606b95dd7dacd1288015caf1442a40850ca9e905c9f80aa8d6b8166a48d\": container with ID starting with f3997606b95dd7dacd1288015caf1442a40850ca9e905c9f80aa8d6b8166a48d not found: ID does not exist" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.705906 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.716107 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 05:34:50 crc kubenswrapper[4628]: E1211 05:34:50.716595 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4653a43-d598-4252-9f50-f2d14521d44c" containerName="nova-scheduler-scheduler" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.716615 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4653a43-d598-4252-9f50-f2d14521d44c" containerName="nova-scheduler-scheduler" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.717059 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4653a43-d598-4252-9f50-f2d14521d44c" containerName="nova-scheduler-scheduler" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.717698 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.721345 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.726901 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.754856 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30bca213-0fd3-4ac3-b075-f3deb4b54cfd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30bca213-0fd3-4ac3-b075-f3deb4b54cfd\") " pod="openstack/nova-scheduler-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.755158 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30bca213-0fd3-4ac3-b075-f3deb4b54cfd-config-data\") pod \"nova-scheduler-0\" (UID: \"30bca213-0fd3-4ac3-b075-f3deb4b54cfd\") " pod="openstack/nova-scheduler-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.755302 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knjj5\" (UniqueName: \"kubernetes.io/projected/30bca213-0fd3-4ac3-b075-f3deb4b54cfd-kube-api-access-knjj5\") pod \"nova-scheduler-0\" (UID: \"30bca213-0fd3-4ac3-b075-f3deb4b54cfd\") " pod="openstack/nova-scheduler-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.862706 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knjj5\" (UniqueName: \"kubernetes.io/projected/30bca213-0fd3-4ac3-b075-f3deb4b54cfd-kube-api-access-knjj5\") pod \"nova-scheduler-0\" (UID: \"30bca213-0fd3-4ac3-b075-f3deb4b54cfd\") " pod="openstack/nova-scheduler-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.862824 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30bca213-0fd3-4ac3-b075-f3deb4b54cfd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30bca213-0fd3-4ac3-b075-f3deb4b54cfd\") " pod="openstack/nova-scheduler-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.863006 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30bca213-0fd3-4ac3-b075-f3deb4b54cfd-config-data\") pod \"nova-scheduler-0\" (UID: \"30bca213-0fd3-4ac3-b075-f3deb4b54cfd\") " pod="openstack/nova-scheduler-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.867684 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30bca213-0fd3-4ac3-b075-f3deb4b54cfd-config-data\") pod \"nova-scheduler-0\" (UID: \"30bca213-0fd3-4ac3-b075-f3deb4b54cfd\") " pod="openstack/nova-scheduler-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.876585 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30bca213-0fd3-4ac3-b075-f3deb4b54cfd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30bca213-0fd3-4ac3-b075-f3deb4b54cfd\") " pod="openstack/nova-scheduler-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.893758 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knjj5\" (UniqueName: 
\"kubernetes.io/projected/30bca213-0fd3-4ac3-b075-f3deb4b54cfd-kube-api-access-knjj5\") pod \"nova-scheduler-0\" (UID: \"30bca213-0fd3-4ac3-b075-f3deb4b54cfd\") " pod="openstack/nova-scheduler-0" Dec 11 05:34:50 crc kubenswrapper[4628]: I1211 05:34:50.923855 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:34:51 crc kubenswrapper[4628]: I1211 05:34:51.047477 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 11 05:34:51 crc kubenswrapper[4628]: I1211 05:34:51.535615 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 05:34:51 crc kubenswrapper[4628]: W1211 05:34:51.541446 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30bca213_0fd3_4ac3_b075_f3deb4b54cfd.slice/crio-10d51de80aeebbabeebd649708416a2e0e2c995d4b092ea0c5e69e90ad9f3b65 WatchSource:0}: Error finding container 10d51de80aeebbabeebd649708416a2e0e2c995d4b092ea0c5e69e90ad9f3b65: Status 404 returned error can't find the container with id 10d51de80aeebbabeebd649708416a2e0e2c995d4b092ea0c5e69e90ad9f3b65 Dec 11 05:34:51 crc kubenswrapper[4628]: I1211 05:34:51.637430 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30bca213-0fd3-4ac3-b075-f3deb4b54cfd","Type":"ContainerStarted","Data":"10d51de80aeebbabeebd649708416a2e0e2c995d4b092ea0c5e69e90ad9f3b65"} Dec 11 05:34:51 crc kubenswrapper[4628]: I1211 05:34:51.643249 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"40242d08-6531-48a3-8df9-5ee8b069011d","Type":"ContainerStarted","Data":"80f80738f0e7de0b40c4235784d8a93bc94732cbf9f968ba69d0141dbe54d4b4"} Dec 11 05:34:51 crc kubenswrapper[4628]: I1211 05:34:51.643389 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"40242d08-6531-48a3-8df9-5ee8b069011d","Type":"ContainerStarted","Data":"b1a0ce8a73c5b65861e65fc403f7cb16cc1f43c6fffad3100e732b5414b2c91c"} Dec 11 05:34:51 crc kubenswrapper[4628]: I1211 05:34:51.643453 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"40242d08-6531-48a3-8df9-5ee8b069011d","Type":"ContainerStarted","Data":"6730b1b7e4f4e88bd92530fc88d7200e7081365b07ec5b47f6aaca0832c22abc"} Dec 11 05:34:51 crc kubenswrapper[4628]: I1211 05:34:51.902218 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4653a43-d598-4252-9f50-f2d14521d44c" path="/var/lib/kubelet/pods/e4653a43-d598-4252-9f50-f2d14521d44c/volumes" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.284408 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.308128 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.308113161 podStartE2EDuration="3.308113161s" podCreationTimestamp="2025-12-11 05:34:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:34:51.676238114 +0000 UTC m=+1194.093584802" watchObservedRunningTime="2025-12-11 05:34:52.308113161 +0000 UTC m=+1194.725459859" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.398328 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-logs\") pod \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\" (UID: \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\") " Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.398604 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-config-data\") pod \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\" (UID: \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\") " Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.398786 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cqht\" (UniqueName: \"kubernetes.io/projected/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-kube-api-access-7cqht\") pod \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\" (UID: \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\") " Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.398937 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-combined-ca-bundle\") pod \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\" (UID: \"513edbab-8ccd-48ab-86e4-103e0dd3fc9e\") " Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.400272 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-logs" (OuterVolumeSpecName: "logs") pod "513edbab-8ccd-48ab-86e4-103e0dd3fc9e" (UID: "513edbab-8ccd-48ab-86e4-103e0dd3fc9e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.406893 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-kube-api-access-7cqht" (OuterVolumeSpecName: "kube-api-access-7cqht") pod "513edbab-8ccd-48ab-86e4-103e0dd3fc9e" (UID: "513edbab-8ccd-48ab-86e4-103e0dd3fc9e"). InnerVolumeSpecName "kube-api-access-7cqht". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.432750 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-config-data" (OuterVolumeSpecName: "config-data") pod "513edbab-8ccd-48ab-86e4-103e0dd3fc9e" (UID: "513edbab-8ccd-48ab-86e4-103e0dd3fc9e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.438953 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "513edbab-8ccd-48ab-86e4-103e0dd3fc9e" (UID: "513edbab-8ccd-48ab-86e4-103e0dd3fc9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.501286 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cqht\" (UniqueName: \"kubernetes.io/projected/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-kube-api-access-7cqht\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.501328 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.501340 4628 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-logs\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.501350 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/513edbab-8ccd-48ab-86e4-103e0dd3fc9e-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.651343 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30bca213-0fd3-4ac3-b075-f3deb4b54cfd","Type":"ContainerStarted","Data":"1f65d35112a7aa91a6f42a6dd377d163fb54a5bed65c8b8e355dacf4b3fd4ba5"} Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.653837 4628 generic.go:334] "Generic (PLEG): container finished" podID="513edbab-8ccd-48ab-86e4-103e0dd3fc9e" containerID="b3727ba33efdfdca1f868d1099eb0f486e3a1854d9c451d990454747c73cddca" exitCode=0 Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.654054 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.654450 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"513edbab-8ccd-48ab-86e4-103e0dd3fc9e","Type":"ContainerDied","Data":"b3727ba33efdfdca1f868d1099eb0f486e3a1854d9c451d990454747c73cddca"} Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.654526 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"513edbab-8ccd-48ab-86e4-103e0dd3fc9e","Type":"ContainerDied","Data":"486e87f17bc7859a07d6b42379941680493f38d682fa222c409152ce59985f6b"} Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.654548 4628 scope.go:117] "RemoveContainer" containerID="b3727ba33efdfdca1f868d1099eb0f486e3a1854d9c451d990454747c73cddca" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.684037 4628 scope.go:117] "RemoveContainer" containerID="100b19f66f91eb948a669a28deaf9f10ac734b8488c81c1d62870220dbcbd2fc" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.693596 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.693580908 podStartE2EDuration="2.693580908s" podCreationTimestamp="2025-12-11 05:34:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:34:52.67023297 +0000 UTC m=+1195.087579678" watchObservedRunningTime="2025-12-11 05:34:52.693580908 +0000 UTC m=+1195.110927596" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.701479 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.724513 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.738985 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 11 05:34:52 crc kubenswrapper[4628]: E1211 05:34:52.739496 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="513edbab-8ccd-48ab-86e4-103e0dd3fc9e" containerName="nova-api-api" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.739514 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="513edbab-8ccd-48ab-86e4-103e0dd3fc9e" containerName="nova-api-api" Dec 11 05:34:52 crc kubenswrapper[4628]: E1211 05:34:52.739530 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="513edbab-8ccd-48ab-86e4-103e0dd3fc9e" containerName="nova-api-log" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.739536 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="513edbab-8ccd-48ab-86e4-103e0dd3fc9e" containerName="nova-api-log" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.739723 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="513edbab-8ccd-48ab-86e4-103e0dd3fc9e" containerName="nova-api-api" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.739748 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="513edbab-8ccd-48ab-86e4-103e0dd3fc9e" containerName="nova-api-log" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.740749 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.747260 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.747690 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.755668 4628 scope.go:117] "RemoveContainer" containerID="b3727ba33efdfdca1f868d1099eb0f486e3a1854d9c451d990454747c73cddca" Dec 11 05:34:52 crc kubenswrapper[4628]: E1211 05:34:52.756130 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3727ba33efdfdca1f868d1099eb0f486e3a1854d9c451d990454747c73cddca\": container with ID starting with b3727ba33efdfdca1f868d1099eb0f486e3a1854d9c451d990454747c73cddca not found: ID does not exist" containerID="b3727ba33efdfdca1f868d1099eb0f486e3a1854d9c451d990454747c73cddca" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.756168 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3727ba33efdfdca1f868d1099eb0f486e3a1854d9c451d990454747c73cddca"} err="failed to get container status \"b3727ba33efdfdca1f868d1099eb0f486e3a1854d9c451d990454747c73cddca\": rpc error: code = NotFound desc = could not find container \"b3727ba33efdfdca1f868d1099eb0f486e3a1854d9c451d990454747c73cddca\": container with ID starting with b3727ba33efdfdca1f868d1099eb0f486e3a1854d9c451d990454747c73cddca not found: ID does not exist" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.756194 4628 scope.go:117] "RemoveContainer" containerID="100b19f66f91eb948a669a28deaf9f10ac734b8488c81c1d62870220dbcbd2fc" Dec 11 05:34:52 crc kubenswrapper[4628]: E1211 05:34:52.756553 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"100b19f66f91eb948a669a28deaf9f10ac734b8488c81c1d62870220dbcbd2fc\": container with ID starting with 100b19f66f91eb948a669a28deaf9f10ac734b8488c81c1d62870220dbcbd2fc not found: ID does not exist" containerID="100b19f66f91eb948a669a28deaf9f10ac734b8488c81c1d62870220dbcbd2fc" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.756586 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"100b19f66f91eb948a669a28deaf9f10ac734b8488c81c1d62870220dbcbd2fc"} err="failed to get container status \"100b19f66f91eb948a669a28deaf9f10ac734b8488c81c1d62870220dbcbd2fc\": rpc error: code = NotFound desc = could not find container \"100b19f66f91eb948a669a28deaf9f10ac734b8488c81c1d62870220dbcbd2fc\": container with ID starting with 100b19f66f91eb948a669a28deaf9f10ac734b8488c81c1d62870220dbcbd2fc not found: ID does not exist" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.809359 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ed609a-2df2-423d-a74c-5f7285009a49-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f2ed609a-2df2-423d-a74c-5f7285009a49\") " pod="openstack/nova-api-0" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.810041 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ed609a-2df2-423d-a74c-5f7285009a49-config-data\") pod \"nova-api-0\" (UID: \"f2ed609a-2df2-423d-a74c-5f7285009a49\") " 
pod="openstack/nova-api-0" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.810097 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt8rc\" (UniqueName: \"kubernetes.io/projected/f2ed609a-2df2-423d-a74c-5f7285009a49-kube-api-access-wt8rc\") pod \"nova-api-0\" (UID: \"f2ed609a-2df2-423d-a74c-5f7285009a49\") " pod="openstack/nova-api-0" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.810174 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2ed609a-2df2-423d-a74c-5f7285009a49-logs\") pod \"nova-api-0\" (UID: \"f2ed609a-2df2-423d-a74c-5f7285009a49\") " pod="openstack/nova-api-0" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.911413 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ed609a-2df2-423d-a74c-5f7285009a49-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f2ed609a-2df2-423d-a74c-5f7285009a49\") " pod="openstack/nova-api-0" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.911663 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ed609a-2df2-423d-a74c-5f7285009a49-config-data\") pod \"nova-api-0\" (UID: \"f2ed609a-2df2-423d-a74c-5f7285009a49\") " pod="openstack/nova-api-0" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.911771 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt8rc\" (UniqueName: \"kubernetes.io/projected/f2ed609a-2df2-423d-a74c-5f7285009a49-kube-api-access-wt8rc\") pod \"nova-api-0\" (UID: \"f2ed609a-2df2-423d-a74c-5f7285009a49\") " pod="openstack/nova-api-0" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.911959 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2ed609a-2df2-423d-a74c-5f7285009a49-logs\") pod \"nova-api-0\" (UID: \"f2ed609a-2df2-423d-a74c-5f7285009a49\") " pod="openstack/nova-api-0" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.912515 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2ed609a-2df2-423d-a74c-5f7285009a49-logs\") pod \"nova-api-0\" (UID: \"f2ed609a-2df2-423d-a74c-5f7285009a49\") " pod="openstack/nova-api-0" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.921669 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ed609a-2df2-423d-a74c-5f7285009a49-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f2ed609a-2df2-423d-a74c-5f7285009a49\") " pod="openstack/nova-api-0" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.922880 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ed609a-2df2-423d-a74c-5f7285009a49-config-data\") pod \"nova-api-0\" (UID: \"f2ed609a-2df2-423d-a74c-5f7285009a49\") " pod="openstack/nova-api-0" Dec 11 05:34:52 crc kubenswrapper[4628]: I1211 05:34:52.938000 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt8rc\" (UniqueName: \"kubernetes.io/projected/f2ed609a-2df2-423d-a74c-5f7285009a49-kube-api-access-wt8rc\") pod \"nova-api-0\" (UID: \"f2ed609a-2df2-423d-a74c-5f7285009a49\") " pod="openstack/nova-api-0" Dec 11 05:34:53 
crc kubenswrapper[4628]: I1211 05:34:53.069338 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 11 05:34:53 crc kubenswrapper[4628]: I1211 05:34:53.507077 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 11 05:34:53 crc kubenswrapper[4628]: I1211 05:34:53.665631 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f2ed609a-2df2-423d-a74c-5f7285009a49","Type":"ContainerStarted","Data":"bcd2606400113e9a1955ffb95bc3590cfe831381691cf9e69c4612ee7b222bd7"} Dec 11 05:34:53 crc kubenswrapper[4628]: I1211 05:34:53.922732 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="513edbab-8ccd-48ab-86e4-103e0dd3fc9e" path="/var/lib/kubelet/pods/513edbab-8ccd-48ab-86e4-103e0dd3fc9e/volumes" Dec 11 05:34:54 crc kubenswrapper[4628]: I1211 05:34:54.683114 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f2ed609a-2df2-423d-a74c-5f7285009a49","Type":"ContainerStarted","Data":"329ba2f25aae8fac756d8183cef57005a0cbb6020743022f1f9f4620fca54db4"} Dec 11 05:34:54 crc kubenswrapper[4628]: I1211 05:34:54.683483 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f2ed609a-2df2-423d-a74c-5f7285009a49","Type":"ContainerStarted","Data":"35123226cdca93b61dc48e92282177ced3fcc0a14643e529470f807ce5807d89"} Dec 11 05:34:55 crc kubenswrapper[4628]: I1211 05:34:55.141939 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 11 05:34:55 crc kubenswrapper[4628]: I1211 05:34:55.142344 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 11 05:34:55 crc kubenswrapper[4628]: I1211 05:34:55.201804 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.201767897 podStartE2EDuration="3.201767897s" podCreationTimestamp="2025-12-11 05:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:34:55.17098369 +0000 UTC m=+1197.588330418" watchObservedRunningTime="2025-12-11 05:34:55.201767897 +0000 UTC m=+1197.619114635" Dec 11 05:34:56 crc kubenswrapper[4628]: I1211 05:34:56.048324 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 11 05:34:59 crc kubenswrapper[4628]: I1211 05:34:59.035195 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 11 05:35:00 crc kubenswrapper[4628]: I1211 05:35:00.142329 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 11 05:35:00 crc kubenswrapper[4628]: I1211 05:35:00.145978 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 11 05:35:01 crc kubenswrapper[4628]: I1211 05:35:01.049125 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 11 05:35:01 crc kubenswrapper[4628]: I1211 05:35:01.087003 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 11 05:35:01 crc kubenswrapper[4628]: I1211 05:35:01.162312 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="40242d08-6531-48a3-8df9-5ee8b069011d" 
containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 05:35:01 crc kubenswrapper[4628]: I1211 05:35:01.162500 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="40242d08-6531-48a3-8df9-5ee8b069011d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 05:35:01 crc kubenswrapper[4628]: I1211 05:35:01.825533 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 11 05:35:03 crc kubenswrapper[4628]: I1211 05:35:03.069782 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 11 05:35:03 crc kubenswrapper[4628]: I1211 05:35:03.069894 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 11 05:35:04 crc kubenswrapper[4628]: I1211 05:35:04.111167 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f2ed609a-2df2-423d-a74c-5f7285009a49" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 05:35:04 crc kubenswrapper[4628]: I1211 05:35:04.152447 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f2ed609a-2df2-423d-a74c-5f7285009a49" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 11 05:35:10 crc kubenswrapper[4628]: I1211 05:35:10.149552 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 11 05:35:10 crc kubenswrapper[4628]: I1211 05:35:10.160378 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 11 05:35:10 crc kubenswrapper[4628]: I1211 05:35:10.162783 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 11 05:35:10 crc kubenswrapper[4628]: I1211 05:35:10.916421 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 11 05:35:12 crc kubenswrapper[4628]: I1211 05:35:12.949354 4628 generic.go:334] "Generic (PLEG): container finished" podID="bddbc62d-5dcc-4ad3-9e74-adf443315395" containerID="520f93202cb4c3be86306ac0ef9a3f0ec65b385db5c541efe42cc93aa795bde3" exitCode=137 Dec 11 05:35:12 crc kubenswrapper[4628]: I1211 05:35:12.949437 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bddbc62d-5dcc-4ad3-9e74-adf443315395","Type":"ContainerDied","Data":"520f93202cb4c3be86306ac0ef9a3f0ec65b385db5c541efe42cc93aa795bde3"} Dec 11 05:35:12 crc kubenswrapper[4628]: I1211 05:35:12.950065 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bddbc62d-5dcc-4ad3-9e74-adf443315395","Type":"ContainerDied","Data":"6b189f099e5cb15aa2011252b1d95c786f98ec0506d2c12e98aa2561a618e2d9"} Dec 11 05:35:12 crc kubenswrapper[4628]: I1211 05:35:12.950079 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b189f099e5cb15aa2011252b1d95c786f98ec0506d2c12e98aa2561a618e2d9" Dec 11 
05:35:13 crc kubenswrapper[4628]: I1211 05:35:13.008270 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:13 crc kubenswrapper[4628]: I1211 05:35:13.060322 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bddbc62d-5dcc-4ad3-9e74-adf443315395-combined-ca-bundle\") pod \"bddbc62d-5dcc-4ad3-9e74-adf443315395\" (UID: \"bddbc62d-5dcc-4ad3-9e74-adf443315395\") " Dec 11 05:35:13 crc kubenswrapper[4628]: I1211 05:35:13.060623 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zlw5\" (UniqueName: \"kubernetes.io/projected/bddbc62d-5dcc-4ad3-9e74-adf443315395-kube-api-access-4zlw5\") pod \"bddbc62d-5dcc-4ad3-9e74-adf443315395\" (UID: \"bddbc62d-5dcc-4ad3-9e74-adf443315395\") " Dec 11 05:35:13 crc kubenswrapper[4628]: I1211 05:35:13.060665 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bddbc62d-5dcc-4ad3-9e74-adf443315395-config-data\") pod \"bddbc62d-5dcc-4ad3-9e74-adf443315395\" (UID: \"bddbc62d-5dcc-4ad3-9e74-adf443315395\") " Dec 11 05:35:13 crc kubenswrapper[4628]: I1211 05:35:13.071062 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bddbc62d-5dcc-4ad3-9e74-adf443315395-kube-api-access-4zlw5" (OuterVolumeSpecName: "kube-api-access-4zlw5") pod "bddbc62d-5dcc-4ad3-9e74-adf443315395" (UID: "bddbc62d-5dcc-4ad3-9e74-adf443315395"). InnerVolumeSpecName "kube-api-access-4zlw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:35:13 crc kubenswrapper[4628]: I1211 05:35:13.077638 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 11 05:35:13 crc kubenswrapper[4628]: I1211 05:35:13.078272 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 11 05:35:13 crc kubenswrapper[4628]: I1211 05:35:13.079227 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 11 05:35:13 crc kubenswrapper[4628]: I1211 05:35:13.082486 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 11 05:35:13 crc kubenswrapper[4628]: I1211 05:35:13.086745 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bddbc62d-5dcc-4ad3-9e74-adf443315395-config-data" (OuterVolumeSpecName: "config-data") pod "bddbc62d-5dcc-4ad3-9e74-adf443315395" (UID: "bddbc62d-5dcc-4ad3-9e74-adf443315395"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:13 crc kubenswrapper[4628]: I1211 05:35:13.104777 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bddbc62d-5dcc-4ad3-9e74-adf443315395-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bddbc62d-5dcc-4ad3-9e74-adf443315395" (UID: "bddbc62d-5dcc-4ad3-9e74-adf443315395"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:13 crc kubenswrapper[4628]: I1211 05:35:13.163033 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bddbc62d-5dcc-4ad3-9e74-adf443315395-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:13 crc kubenswrapper[4628]: I1211 05:35:13.163070 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zlw5\" (UniqueName: \"kubernetes.io/projected/bddbc62d-5dcc-4ad3-9e74-adf443315395-kube-api-access-4zlw5\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:13 crc kubenswrapper[4628]: I1211 05:35:13.163088 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bddbc62d-5dcc-4ad3-9e74-adf443315395-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:13 crc kubenswrapper[4628]: I1211 05:35:13.960224 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:13 crc kubenswrapper[4628]: I1211 05:35:13.961128 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 11 05:35:13 crc kubenswrapper[4628]: I1211 05:35:13.973517 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.008924 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.014189 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.068488 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 11 05:35:14 crc kubenswrapper[4628]: E1211 05:35:14.070649 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bddbc62d-5dcc-4ad3-9e74-adf443315395" containerName="nova-cell1-novncproxy-novncproxy" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.070671 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="bddbc62d-5dcc-4ad3-9e74-adf443315395" containerName="nova-cell1-novncproxy-novncproxy" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.071052 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="bddbc62d-5dcc-4ad3-9e74-adf443315395" containerName="nova-cell1-novncproxy-novncproxy" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.071887 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.079514 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.080026 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.080310 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.094852 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.185682 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b29ff2-7a02-42ed-9dde-d998ad2e693f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d1b29ff2-7a02-42ed-9dde-d998ad2e693f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.185725 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b29ff2-7a02-42ed-9dde-d998ad2e693f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d1b29ff2-7a02-42ed-9dde-d998ad2e693f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.185820 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b29ff2-7a02-42ed-9dde-d998ad2e693f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d1b29ff2-7a02-42ed-9dde-d998ad2e693f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.185854 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1b29ff2-7a02-42ed-9dde-d998ad2e693f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d1b29ff2-7a02-42ed-9dde-d998ad2e693f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.185877 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phmdj\" (UniqueName: \"kubernetes.io/projected/d1b29ff2-7a02-42ed-9dde-d998ad2e693f-kube-api-access-phmdj\") pod \"nova-cell1-novncproxy-0\" (UID: \"d1b29ff2-7a02-42ed-9dde-d998ad2e693f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.215670 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-pwlzk"] Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.221916 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.245724 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-pwlzk"] Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.291399 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phmdj\" (UniqueName: \"kubernetes.io/projected/d1b29ff2-7a02-42ed-9dde-d998ad2e693f-kube-api-access-phmdj\") pod \"nova-cell1-novncproxy-0\" (UID: \"d1b29ff2-7a02-42ed-9dde-d998ad2e693f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.291484 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-pwlzk\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.291509 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b29ff2-7a02-42ed-9dde-d998ad2e693f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d1b29ff2-7a02-42ed-9dde-d998ad2e693f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.291524 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-pwlzk\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.291541 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-pwlzk\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.291558 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b29ff2-7a02-42ed-9dde-d998ad2e693f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d1b29ff2-7a02-42ed-9dde-d998ad2e693f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.291622 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wslmq\" (UniqueName: \"kubernetes.io/projected/2f5158e3-ab0a-4ceb-af73-55994e618c50-kube-api-access-wslmq\") pod \"dnsmasq-dns-89c5cd4d5-pwlzk\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.291653 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-pwlzk\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.291669 4628 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b29ff2-7a02-42ed-9dde-d998ad2e693f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d1b29ff2-7a02-42ed-9dde-d998ad2e693f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.291688 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1b29ff2-7a02-42ed-9dde-d998ad2e693f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d1b29ff2-7a02-42ed-9dde-d998ad2e693f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.291702 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-config\") pod \"dnsmasq-dns-89c5cd4d5-pwlzk\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.296459 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b29ff2-7a02-42ed-9dde-d998ad2e693f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d1b29ff2-7a02-42ed-9dde-d998ad2e693f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.296774 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1b29ff2-7a02-42ed-9dde-d998ad2e693f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d1b29ff2-7a02-42ed-9dde-d998ad2e693f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.297028 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b29ff2-7a02-42ed-9dde-d998ad2e693f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d1b29ff2-7a02-42ed-9dde-d998ad2e693f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.315717 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phmdj\" (UniqueName: \"kubernetes.io/projected/d1b29ff2-7a02-42ed-9dde-d998ad2e693f-kube-api-access-phmdj\") pod \"nova-cell1-novncproxy-0\" (UID: \"d1b29ff2-7a02-42ed-9dde-d998ad2e693f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.319476 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b29ff2-7a02-42ed-9dde-d998ad2e693f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d1b29ff2-7a02-42ed-9dde-d998ad2e693f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.394518 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-pwlzk\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.394558 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-pwlzk\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.394582 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-pwlzk\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.394660 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wslmq\" (UniqueName: \"kubernetes.io/projected/2f5158e3-ab0a-4ceb-af73-55994e618c50-kube-api-access-wslmq\") pod \"dnsmasq-dns-89c5cd4d5-pwlzk\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.394696 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-pwlzk\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.394717 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-config\") pod \"dnsmasq-dns-89c5cd4d5-pwlzk\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.395515 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-pwlzk\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.395522 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-config\") pod \"dnsmasq-dns-89c5cd4d5-pwlzk\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.395673 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-pwlzk\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.396174 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-pwlzk\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.396253 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-pwlzk\" (UID: 
\"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.410010 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.429515 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wslmq\" (UniqueName: \"kubernetes.io/projected/2f5158e3-ab0a-4ceb-af73-55994e618c50-kube-api-access-wslmq\") pod \"dnsmasq-dns-89c5cd4d5-pwlzk\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.547601 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:14 crc kubenswrapper[4628]: I1211 05:35:14.970312 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 11 05:35:15 crc kubenswrapper[4628]: I1211 05:35:15.136241 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-pwlzk"] Dec 11 05:35:15 crc kubenswrapper[4628]: I1211 05:35:15.899277 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bddbc62d-5dcc-4ad3-9e74-adf443315395" path="/var/lib/kubelet/pods/bddbc62d-5dcc-4ad3-9e74-adf443315395/volumes" Dec 11 05:35:15 crc kubenswrapper[4628]: I1211 05:35:15.977224 4628 generic.go:334] "Generic (PLEG): container finished" podID="2f5158e3-ab0a-4ceb-af73-55994e618c50" containerID="7f6644e0b0532911a00388ffc4629a0331ab96cff590ae21f17f553d471c533e" exitCode=0 Dec 11 05:35:15 crc kubenswrapper[4628]: I1211 05:35:15.977267 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" event={"ID":"2f5158e3-ab0a-4ceb-af73-55994e618c50","Type":"ContainerDied","Data":"7f6644e0b0532911a00388ffc4629a0331ab96cff590ae21f17f553d471c533e"} Dec 11 05:35:15 crc kubenswrapper[4628]: I1211 05:35:15.977304 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" event={"ID":"2f5158e3-ab0a-4ceb-af73-55994e618c50","Type":"ContainerStarted","Data":"88a1fa0f21cd76a818b512ba0fe7fa8053324a0066b1b6bffeece46b1af38660"} Dec 11 05:35:15 crc kubenswrapper[4628]: I1211 05:35:15.979017 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d1b29ff2-7a02-42ed-9dde-d998ad2e693f","Type":"ContainerStarted","Data":"c2cf6e8e593c3409a4d8158c0da1e2567ef92c7bf9de1336f1d9a312197ac9ab"} Dec 11 05:35:15 crc kubenswrapper[4628]: I1211 05:35:15.979051 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d1b29ff2-7a02-42ed-9dde-d998ad2e693f","Type":"ContainerStarted","Data":"508de6c411aaba9e94be433dc17c642ca8baa6259d29e2c27b3e99c531d6b55f"} Dec 11 05:35:16 crc kubenswrapper[4628]: I1211 05:35:16.592286 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.592268283 podStartE2EDuration="2.592268283s" podCreationTimestamp="2025-12-11 05:35:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:35:16.065505931 +0000 UTC m=+1218.482852639" watchObservedRunningTime="2025-12-11 05:35:16.592268283 +0000 UTC m=+1219.009614981" Dec 11 05:35:16 crc kubenswrapper[4628]: I1211 05:35:16.599478 4628 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:35:16 crc kubenswrapper[4628]: I1211 05:35:16.599761 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerName="ceilometer-central-agent" containerID="cri-o://898858b28c93e7d104b50ad75512cf245459626a2ed9d4aa3cd6a7b9ff531fcb" gracePeriod=30 Dec 11 05:35:16 crc kubenswrapper[4628]: I1211 05:35:16.599829 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerName="proxy-httpd" containerID="cri-o://52b73fcf7c230bd4a08743ed7128297c2aa9e77b6cce65dc8a684d29ce2d1dad" gracePeriod=30 Dec 11 05:35:16 crc kubenswrapper[4628]: I1211 05:35:16.599909 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerName="ceilometer-notification-agent" containerID="cri-o://6671190ee3a13597797c780eef9e845e170e76223351084c55af2c87cc5c77fd" gracePeriod=30 Dec 11 05:35:16 crc kubenswrapper[4628]: I1211 05:35:16.600014 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerName="sg-core" containerID="cri-o://a869f35e461c880c9010550466393fc2940504ca684c65001162c9583928b4b7" gracePeriod=30 Dec 11 05:35:17 crc kubenswrapper[4628]: I1211 05:35:17.000098 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" event={"ID":"2f5158e3-ab0a-4ceb-af73-55994e618c50","Type":"ContainerStarted","Data":"4a7601518f7101a3b07c2b73450be40f1441020c90a96dd3fc1ec73d6977692c"} Dec 11 05:35:17 crc kubenswrapper[4628]: I1211 05:35:17.001201 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:17 crc kubenswrapper[4628]: I1211 05:35:17.004280 4628 generic.go:334] "Generic (PLEG): container finished" podID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerID="a869f35e461c880c9010550466393fc2940504ca684c65001162c9583928b4b7" exitCode=2 Dec 11 05:35:17 crc kubenswrapper[4628]: I1211 05:35:17.004892 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350c7aef-5a63-4478-b857-a2ad272d4d75","Type":"ContainerDied","Data":"a869f35e461c880c9010550466393fc2940504ca684c65001162c9583928b4b7"} Dec 11 05:35:17 crc kubenswrapper[4628]: I1211 05:35:17.028804 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 11 05:35:17 crc kubenswrapper[4628]: I1211 05:35:17.029264 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f2ed609a-2df2-423d-a74c-5f7285009a49" containerName="nova-api-api" containerID="cri-o://35123226cdca93b61dc48e92282177ced3fcc0a14643e529470f807ce5807d89" gracePeriod=30 Dec 11 05:35:17 crc kubenswrapper[4628]: I1211 05:35:17.029494 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f2ed609a-2df2-423d-a74c-5f7285009a49" containerName="nova-api-log" containerID="cri-o://329ba2f25aae8fac756d8183cef57005a0cbb6020743022f1f9f4620fca54db4" gracePeriod=30 Dec 11 05:35:17 crc kubenswrapper[4628]: I1211 05:35:17.032180 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" 
podStartSLOduration=3.032158032 podStartE2EDuration="3.032158032s" podCreationTimestamp="2025-12-11 05:35:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:35:17.020987372 +0000 UTC m=+1219.438334090" watchObservedRunningTime="2025-12-11 05:35:17.032158032 +0000 UTC m=+1219.449504730" Dec 11 05:35:18 crc kubenswrapper[4628]: I1211 05:35:18.013585 4628 generic.go:334] "Generic (PLEG): container finished" podID="f2ed609a-2df2-423d-a74c-5f7285009a49" containerID="329ba2f25aae8fac756d8183cef57005a0cbb6020743022f1f9f4620fca54db4" exitCode=143 Dec 11 05:35:18 crc kubenswrapper[4628]: I1211 05:35:18.013643 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f2ed609a-2df2-423d-a74c-5f7285009a49","Type":"ContainerDied","Data":"329ba2f25aae8fac756d8183cef57005a0cbb6020743022f1f9f4620fca54db4"} Dec 11 05:35:18 crc kubenswrapper[4628]: I1211 05:35:18.017063 4628 generic.go:334] "Generic (PLEG): container finished" podID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerID="52b73fcf7c230bd4a08743ed7128297c2aa9e77b6cce65dc8a684d29ce2d1dad" exitCode=0 Dec 11 05:35:18 crc kubenswrapper[4628]: I1211 05:35:18.017093 4628 generic.go:334] "Generic (PLEG): container finished" podID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerID="898858b28c93e7d104b50ad75512cf245459626a2ed9d4aa3cd6a7b9ff531fcb" exitCode=0 Dec 11 05:35:18 crc kubenswrapper[4628]: I1211 05:35:18.017088 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350c7aef-5a63-4478-b857-a2ad272d4d75","Type":"ContainerDied","Data":"52b73fcf7c230bd4a08743ed7128297c2aa9e77b6cce65dc8a684d29ce2d1dad"} Dec 11 05:35:18 crc kubenswrapper[4628]: I1211 05:35:18.017117 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350c7aef-5a63-4478-b857-a2ad272d4d75","Type":"ContainerDied","Data":"898858b28c93e7d104b50ad75512cf245459626a2ed9d4aa3cd6a7b9ff531fcb"} Dec 11 05:35:19 crc kubenswrapper[4628]: I1211 05:35:19.411312 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:20 crc kubenswrapper[4628]: I1211 05:35:20.657315 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 11 05:35:20 crc kubenswrapper[4628]: I1211 05:35:20.731736 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt8rc\" (UniqueName: \"kubernetes.io/projected/f2ed609a-2df2-423d-a74c-5f7285009a49-kube-api-access-wt8rc\") pod \"f2ed609a-2df2-423d-a74c-5f7285009a49\" (UID: \"f2ed609a-2df2-423d-a74c-5f7285009a49\") " Dec 11 05:35:20 crc kubenswrapper[4628]: I1211 05:35:20.731824 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ed609a-2df2-423d-a74c-5f7285009a49-config-data\") pod \"f2ed609a-2df2-423d-a74c-5f7285009a49\" (UID: \"f2ed609a-2df2-423d-a74c-5f7285009a49\") " Dec 11 05:35:20 crc kubenswrapper[4628]: I1211 05:35:20.731880 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2ed609a-2df2-423d-a74c-5f7285009a49-logs\") pod \"f2ed609a-2df2-423d-a74c-5f7285009a49\" (UID: \"f2ed609a-2df2-423d-a74c-5f7285009a49\") " Dec 11 05:35:20 crc kubenswrapper[4628]: I1211 05:35:20.731998 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ed609a-2df2-423d-a74c-5f7285009a49-combined-ca-bundle\") pod \"f2ed609a-2df2-423d-a74c-5f7285009a49\" (UID: \"f2ed609a-2df2-423d-a74c-5f7285009a49\") " Dec 11 05:35:20 crc kubenswrapper[4628]: I1211 05:35:20.733290 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2ed609a-2df2-423d-a74c-5f7285009a49-logs" (OuterVolumeSpecName: "logs") pod "f2ed609a-2df2-423d-a74c-5f7285009a49" (UID: "f2ed609a-2df2-423d-a74c-5f7285009a49"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:35:20 crc kubenswrapper[4628]: I1211 05:35:20.738228 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2ed609a-2df2-423d-a74c-5f7285009a49-kube-api-access-wt8rc" (OuterVolumeSpecName: "kube-api-access-wt8rc") pod "f2ed609a-2df2-423d-a74c-5f7285009a49" (UID: "f2ed609a-2df2-423d-a74c-5f7285009a49"). InnerVolumeSpecName "kube-api-access-wt8rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:35:20 crc kubenswrapper[4628]: I1211 05:35:20.771255 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ed609a-2df2-423d-a74c-5f7285009a49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2ed609a-2df2-423d-a74c-5f7285009a49" (UID: "f2ed609a-2df2-423d-a74c-5f7285009a49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:20 crc kubenswrapper[4628]: I1211 05:35:20.797749 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ed609a-2df2-423d-a74c-5f7285009a49-config-data" (OuterVolumeSpecName: "config-data") pod "f2ed609a-2df2-423d-a74c-5f7285009a49" (UID: "f2ed609a-2df2-423d-a74c-5f7285009a49"). InnerVolumeSpecName "config-data". 
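The DELETE sequence above shows the kubelet killing both nova-api-0 containers with a 30 second grace period; nova-api-log exits with code 143 (128 + SIGTERM) and the volume manager then tears down the pod's secret, projected and empty-dir volumes. Below is a minimal client-go sketch, assuming a clientset built from the local kubeconfig, of the kind of API deletion that produces this sequence; only the namespace, pod name and grace period are taken from the log.

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption: a kubeconfig at the default location; inside a cluster you
	// would use rest.InClusterConfig() instead.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// Delete the pod with the same 30s grace period seen in the log. The kubelet
	// SIGTERMs each container and escalates to SIGKILL only if the period expires;
	// nova-api-log exiting with 143 (128+SIGTERM) is the normal outcome.
	grace := int64(30)
	err = client.CoreV1().Pods("openstack").Delete(context.TODO(), "nova-api-0",
		metav1.DeleteOptions{GracePeriodSeconds: &grace})
	fmt.Println("delete requested, err:", err)
}
```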
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:20 crc kubenswrapper[4628]: I1211 05:35:20.833477 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ed609a-2df2-423d-a74c-5f7285009a49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:20 crc kubenswrapper[4628]: I1211 05:35:20.833511 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt8rc\" (UniqueName: \"kubernetes.io/projected/f2ed609a-2df2-423d-a74c-5f7285009a49-kube-api-access-wt8rc\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:20 crc kubenswrapper[4628]: I1211 05:35:20.833524 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ed609a-2df2-423d-a74c-5f7285009a49-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:20 crc kubenswrapper[4628]: I1211 05:35:20.833533 4628 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2ed609a-2df2-423d-a74c-5f7285009a49-logs\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.048182 4628 generic.go:334] "Generic (PLEG): container finished" podID="f2ed609a-2df2-423d-a74c-5f7285009a49" containerID="35123226cdca93b61dc48e92282177ced3fcc0a14643e529470f807ce5807d89" exitCode=0 Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.048232 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f2ed609a-2df2-423d-a74c-5f7285009a49","Type":"ContainerDied","Data":"35123226cdca93b61dc48e92282177ced3fcc0a14643e529470f807ce5807d89"} Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.048259 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f2ed609a-2df2-423d-a74c-5f7285009a49","Type":"ContainerDied","Data":"bcd2606400113e9a1955ffb95bc3590cfe831381691cf9e69c4612ee7b222bd7"} Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.048275 4628 scope.go:117] "RemoveContainer" containerID="35123226cdca93b61dc48e92282177ced3fcc0a14643e529470f807ce5807d89" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.048280 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.092793 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.098544 4628 scope.go:117] "RemoveContainer" containerID="329ba2f25aae8fac756d8183cef57005a0cbb6020743022f1f9f4620fca54db4" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.103875 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.123967 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 11 05:35:21 crc kubenswrapper[4628]: E1211 05:35:21.124295 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2ed609a-2df2-423d-a74c-5f7285009a49" containerName="nova-api-log" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.124311 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2ed609a-2df2-423d-a74c-5f7285009a49" containerName="nova-api-log" Dec 11 05:35:21 crc kubenswrapper[4628]: E1211 05:35:21.124339 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2ed609a-2df2-423d-a74c-5f7285009a49" containerName="nova-api-api" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.124346 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2ed609a-2df2-423d-a74c-5f7285009a49" containerName="nova-api-api" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.124523 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2ed609a-2df2-423d-a74c-5f7285009a49" containerName="nova-api-api" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.124537 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2ed609a-2df2-423d-a74c-5f7285009a49" containerName="nova-api-log" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.125436 4628 scope.go:117] "RemoveContainer" containerID="35123226cdca93b61dc48e92282177ced3fcc0a14643e529470f807ce5807d89" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.125448 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: E1211 05:35:21.125816 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35123226cdca93b61dc48e92282177ced3fcc0a14643e529470f807ce5807d89\": container with ID starting with 35123226cdca93b61dc48e92282177ced3fcc0a14643e529470f807ce5807d89 not found: ID does not exist" containerID="35123226cdca93b61dc48e92282177ced3fcc0a14643e529470f807ce5807d89" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.125866 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35123226cdca93b61dc48e92282177ced3fcc0a14643e529470f807ce5807d89"} err="failed to get container status \"35123226cdca93b61dc48e92282177ced3fcc0a14643e529470f807ce5807d89\": rpc error: code = NotFound desc = could not find container \"35123226cdca93b61dc48e92282177ced3fcc0a14643e529470f807ce5807d89\": container with ID starting with 35123226cdca93b61dc48e92282177ced3fcc0a14643e529470f807ce5807d89 not found: ID does not exist" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.125888 4628 scope.go:117] "RemoveContainer" containerID="329ba2f25aae8fac756d8183cef57005a0cbb6020743022f1f9f4620fca54db4" Dec 11 05:35:21 crc kubenswrapper[4628]: E1211 05:35:21.126598 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"329ba2f25aae8fac756d8183cef57005a0cbb6020743022f1f9f4620fca54db4\": container with ID starting with 329ba2f25aae8fac756d8183cef57005a0cbb6020743022f1f9f4620fca54db4 not found: ID does not exist" containerID="329ba2f25aae8fac756d8183cef57005a0cbb6020743022f1f9f4620fca54db4" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.126630 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"329ba2f25aae8fac756d8183cef57005a0cbb6020743022f1f9f4620fca54db4"} err="failed to get container status \"329ba2f25aae8fac756d8183cef57005a0cbb6020743022f1f9f4620fca54db4\": rpc error: code = NotFound desc = could not find container \"329ba2f25aae8fac756d8183cef57005a0cbb6020743022f1f9f4620fca54db4\": container with ID starting with 329ba2f25aae8fac756d8183cef57005a0cbb6020743022f1f9f4620fca54db4 not found: ID does not exist" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.128424 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.129891 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.132625 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.142902 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.164412 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-public-tls-certs\") pod \"nova-api-0\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.164477 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.164567 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c99175af-e6c5-4808-ae11-7b324c1a6e0b-logs\") pod \"nova-api-0\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.164592 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-config-data\") pod \"nova-api-0\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.164827 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.164901 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvpfb\" (UniqueName: \"kubernetes.io/projected/c99175af-e6c5-4808-ae11-7b324c1a6e0b-kube-api-access-rvpfb\") pod \"nova-api-0\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.266867 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c99175af-e6c5-4808-ae11-7b324c1a6e0b-logs\") pod \"nova-api-0\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.266924 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-config-data\") pod \"nova-api-0\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.266991 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.267011 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvpfb\" (UniqueName: \"kubernetes.io/projected/c99175af-e6c5-4808-ae11-7b324c1a6e0b-kube-api-access-rvpfb\") pod \"nova-api-0\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.267086 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-public-tls-certs\") pod \"nova-api-0\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.267105 4628 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.268093 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c99175af-e6c5-4808-ae11-7b324c1a6e0b-logs\") pod \"nova-api-0\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.273437 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-public-tls-certs\") pod \"nova-api-0\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.273836 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.276543 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-config-data\") pod \"nova-api-0\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.278468 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.285322 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvpfb\" (UniqueName: \"kubernetes.io/projected/c99175af-e6c5-4808-ae11-7b324c1a6e0b-kube-api-access-rvpfb\") pod \"nova-api-0\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.450105 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.904741 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2ed609a-2df2-423d-a74c-5f7285009a49" path="/var/lib/kubelet/pods/f2ed609a-2df2-423d-a74c-5f7285009a49/volumes" Dec 11 05:35:21 crc kubenswrapper[4628]: I1211 05:35:21.910273 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 11 05:35:22 crc kubenswrapper[4628]: I1211 05:35:22.062329 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c99175af-e6c5-4808-ae11-7b324c1a6e0b","Type":"ContainerStarted","Data":"d07872c05f27a59cb0e78830ecd11bc94c676c7f5fd7fc1d1fe98dd40965c962"} Dec 11 05:35:23 crc kubenswrapper[4628]: I1211 05:35:23.080526 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c99175af-e6c5-4808-ae11-7b324c1a6e0b","Type":"ContainerStarted","Data":"3b194296c4e844836ba0d0341e811b58823bcaf4506cd105a4b6de8b14177ba0"} Dec 11 05:35:23 crc kubenswrapper[4628]: I1211 05:35:23.080897 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c99175af-e6c5-4808-ae11-7b324c1a6e0b","Type":"ContainerStarted","Data":"49c0f3ff3b4dacb32da850fe31add5184cf9013931d67791257d530b51fa26b0"} Dec 11 05:35:23 crc kubenswrapper[4628]: I1211 05:35:23.118941 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.118920856 podStartE2EDuration="2.118920856s" podCreationTimestamp="2025-12-11 05:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:35:23.109651158 +0000 UTC m=+1225.526997896" watchObservedRunningTime="2025-12-11 05:35:23.118920856 +0000 UTC m=+1225.536267564" Dec 11 05:35:24 crc kubenswrapper[4628]: I1211 05:35:24.411238 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:24 crc kubenswrapper[4628]: I1211 05:35:24.447898 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:24 crc kubenswrapper[4628]: I1211 05:35:24.549123 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:35:24 crc kubenswrapper[4628]: I1211 05:35:24.619625 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-9mx9c"] Dec 11 05:35:24 crc kubenswrapper[4628]: I1211 05:35:24.620109 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" podUID="532398ad-d877-4e17-a86e-d403bc8a6678" containerName="dnsmasq-dns" containerID="cri-o://46d39b194853d7bbeebac6326bd199cc66aa51ad42e52b74f38f0ca053f49541" gracePeriod=10 Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.078100 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.105423 4628 generic.go:334] "Generic (PLEG): container finished" podID="532398ad-d877-4e17-a86e-d403bc8a6678" containerID="46d39b194853d7bbeebac6326bd199cc66aa51ad42e52b74f38f0ca053f49541" exitCode=0 Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.106576 4628 util.go:48] "No ready sandbox for pod can be found. 
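The pod_startup_latency_tracker entry above reports podStartSLOduration=2.118920856s for the recreated nova-api-0. Since no image pull happened (firstStartedPulling and lastFinishedPulling are the zero time), the value equals watchObservedRunningTime (05:35:23.118920856) minus podCreationTimestamp (05:35:21). A quick arithmetic check in Go, using only the timestamps printed in the log:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the nova-api-0 startup-latency entry in the log.
	created := time.Date(2025, 12, 11, 5, 35, 21, 0, time.UTC)
	observed := time.Date(2025, 12, 11, 5, 35, 23, 118920856, time.UTC)

	// No image pull was involved, so the reported duration is just the difference.
	fmt.Printf("podStartSLOduration=%.9fs\n", observed.Sub(created).Seconds())
	// Output: podStartSLOduration=2.118920856s
}
```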
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.106976 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" event={"ID":"532398ad-d877-4e17-a86e-d403bc8a6678","Type":"ContainerDied","Data":"46d39b194853d7bbeebac6326bd199cc66aa51ad42e52b74f38f0ca053f49541"} Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.107000 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-9mx9c" event={"ID":"532398ad-d877-4e17-a86e-d403bc8a6678","Type":"ContainerDied","Data":"0af8af14f1977b99f7f75e2bc22b833a3d2c1a99dc96510621756d33bdf77d35"} Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.107018 4628 scope.go:117] "RemoveContainer" containerID="46d39b194853d7bbeebac6326bd199cc66aa51ad42e52b74f38f0ca053f49541" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.128884 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.133424 4628 scope.go:117] "RemoveContainer" containerID="c0d5305660274726e29d26e7cf1d7fb50542de05edc3e97a9c67bed0f7743aa8" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.156294 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8d4b\" (UniqueName: \"kubernetes.io/projected/532398ad-d877-4e17-a86e-d403bc8a6678-kube-api-access-g8d4b\") pod \"532398ad-d877-4e17-a86e-d403bc8a6678\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.156373 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-config\") pod \"532398ad-d877-4e17-a86e-d403bc8a6678\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.156503 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-ovsdbserver-sb\") pod \"532398ad-d877-4e17-a86e-d403bc8a6678\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.156542 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-dns-svc\") pod \"532398ad-d877-4e17-a86e-d403bc8a6678\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.156702 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-dns-swift-storage-0\") pod \"532398ad-d877-4e17-a86e-d403bc8a6678\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.156799 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-ovsdbserver-nb\") pod \"532398ad-d877-4e17-a86e-d403bc8a6678\" (UID: \"532398ad-d877-4e17-a86e-d403bc8a6678\") " Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.188051 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/532398ad-d877-4e17-a86e-d403bc8a6678-kube-api-access-g8d4b" (OuterVolumeSpecName: "kube-api-access-g8d4b") pod "532398ad-d877-4e17-a86e-d403bc8a6678" (UID: "532398ad-d877-4e17-a86e-d403bc8a6678"). InnerVolumeSpecName "kube-api-access-g8d4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.188202 4628 scope.go:117] "RemoveContainer" containerID="46d39b194853d7bbeebac6326bd199cc66aa51ad42e52b74f38f0ca053f49541" Dec 11 05:35:25 crc kubenswrapper[4628]: E1211 05:35:25.193652 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46d39b194853d7bbeebac6326bd199cc66aa51ad42e52b74f38f0ca053f49541\": container with ID starting with 46d39b194853d7bbeebac6326bd199cc66aa51ad42e52b74f38f0ca053f49541 not found: ID does not exist" containerID="46d39b194853d7bbeebac6326bd199cc66aa51ad42e52b74f38f0ca053f49541" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.193789 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46d39b194853d7bbeebac6326bd199cc66aa51ad42e52b74f38f0ca053f49541"} err="failed to get container status \"46d39b194853d7bbeebac6326bd199cc66aa51ad42e52b74f38f0ca053f49541\": rpc error: code = NotFound desc = could not find container \"46d39b194853d7bbeebac6326bd199cc66aa51ad42e52b74f38f0ca053f49541\": container with ID starting with 46d39b194853d7bbeebac6326bd199cc66aa51ad42e52b74f38f0ca053f49541 not found: ID does not exist" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.193918 4628 scope.go:117] "RemoveContainer" containerID="c0d5305660274726e29d26e7cf1d7fb50542de05edc3e97a9c67bed0f7743aa8" Dec 11 05:35:25 crc kubenswrapper[4628]: E1211 05:35:25.200247 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0d5305660274726e29d26e7cf1d7fb50542de05edc3e97a9c67bed0f7743aa8\": container with ID starting with c0d5305660274726e29d26e7cf1d7fb50542de05edc3e97a9c67bed0f7743aa8 not found: ID does not exist" containerID="c0d5305660274726e29d26e7cf1d7fb50542de05edc3e97a9c67bed0f7743aa8" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.200293 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d5305660274726e29d26e7cf1d7fb50542de05edc3e97a9c67bed0f7743aa8"} err="failed to get container status \"c0d5305660274726e29d26e7cf1d7fb50542de05edc3e97a9c67bed0f7743aa8\": rpc error: code = NotFound desc = could not find container \"c0d5305660274726e29d26e7cf1d7fb50542de05edc3e97a9c67bed0f7743aa8\": container with ID starting with c0d5305660274726e29d26e7cf1d7fb50542de05edc3e97a9c67bed0f7743aa8 not found: ID does not exist" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.249021 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "532398ad-d877-4e17-a86e-d403bc8a6678" (UID: "532398ad-d877-4e17-a86e-d403bc8a6678"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.261473 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8d4b\" (UniqueName: \"kubernetes.io/projected/532398ad-d877-4e17-a86e-d403bc8a6678-kube-api-access-g8d4b\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.261505 4628 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.266183 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "532398ad-d877-4e17-a86e-d403bc8a6678" (UID: "532398ad-d877-4e17-a86e-d403bc8a6678"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.283316 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "532398ad-d877-4e17-a86e-d403bc8a6678" (UID: "532398ad-d877-4e17-a86e-d403bc8a6678"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.285421 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-z4bpl"] Dec 11 05:35:25 crc kubenswrapper[4628]: E1211 05:35:25.287251 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532398ad-d877-4e17-a86e-d403bc8a6678" containerName="dnsmasq-dns" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.287274 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="532398ad-d877-4e17-a86e-d403bc8a6678" containerName="dnsmasq-dns" Dec 11 05:35:25 crc kubenswrapper[4628]: E1211 05:35:25.287310 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532398ad-d877-4e17-a86e-d403bc8a6678" containerName="init" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.287316 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="532398ad-d877-4e17-a86e-d403bc8a6678" containerName="init" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.287566 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="532398ad-d877-4e17-a86e-d403bc8a6678" containerName="dnsmasq-dns" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.288203 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z4bpl" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.295193 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.295423 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.297402 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-z4bpl"] Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.316638 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "532398ad-d877-4e17-a86e-d403bc8a6678" (UID: "532398ad-d877-4e17-a86e-d403bc8a6678"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.322313 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-config" (OuterVolumeSpecName: "config") pod "532398ad-d877-4e17-a86e-d403bc8a6678" (UID: "532398ad-d877-4e17-a86e-d403bc8a6678"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.362669 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z4bpl\" (UID: \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\") " pod="openstack/nova-cell1-cell-mapping-z4bpl" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.362739 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-config-data\") pod \"nova-cell1-cell-mapping-z4bpl\" (UID: \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\") " pod="openstack/nova-cell1-cell-mapping-z4bpl" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.362880 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhspt\" (UniqueName: \"kubernetes.io/projected/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-kube-api-access-hhspt\") pod \"nova-cell1-cell-mapping-z4bpl\" (UID: \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\") " pod="openstack/nova-cell1-cell-mapping-z4bpl" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.363025 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-scripts\") pod \"nova-cell1-cell-mapping-z4bpl\" (UID: \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\") " pod="openstack/nova-cell1-cell-mapping-z4bpl" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.363186 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.363204 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.363216 4628 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.363225 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532398ad-d877-4e17-a86e-d403bc8a6678-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.433766 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-9mx9c"] Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.442150 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-9mx9c"] Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.465265 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhspt\" (UniqueName: \"kubernetes.io/projected/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-kube-api-access-hhspt\") pod \"nova-cell1-cell-mapping-z4bpl\" (UID: \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\") " pod="openstack/nova-cell1-cell-mapping-z4bpl" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.465375 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-scripts\") pod \"nova-cell1-cell-mapping-z4bpl\" (UID: \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\") " pod="openstack/nova-cell1-cell-mapping-z4bpl" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.465436 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z4bpl\" (UID: \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\") " pod="openstack/nova-cell1-cell-mapping-z4bpl" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.465478 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-config-data\") pod \"nova-cell1-cell-mapping-z4bpl\" (UID: \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\") " pod="openstack/nova-cell1-cell-mapping-z4bpl" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.468642 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z4bpl\" (UID: \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\") " pod="openstack/nova-cell1-cell-mapping-z4bpl" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.468947 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-config-data\") pod \"nova-cell1-cell-mapping-z4bpl\" (UID: \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\") " pod="openstack/nova-cell1-cell-mapping-z4bpl" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.469301 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-scripts\") pod 
\"nova-cell1-cell-mapping-z4bpl\" (UID: \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\") " pod="openstack/nova-cell1-cell-mapping-z4bpl" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.481456 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhspt\" (UniqueName: \"kubernetes.io/projected/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-kube-api-access-hhspt\") pod \"nova-cell1-cell-mapping-z4bpl\" (UID: \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\") " pod="openstack/nova-cell1-cell-mapping-z4bpl" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.624172 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z4bpl" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.904173 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="532398ad-d877-4e17-a86e-d403bc8a6678" path="/var/lib/kubelet/pods/532398ad-d877-4e17-a86e-d403bc8a6678/volumes" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.914515 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.978529 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-combined-ca-bundle\") pod \"350c7aef-5a63-4478-b857-a2ad272d4d75\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.978586 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlcf9\" (UniqueName: \"kubernetes.io/projected/350c7aef-5a63-4478-b857-a2ad272d4d75-kube-api-access-vlcf9\") pod \"350c7aef-5a63-4478-b857-a2ad272d4d75\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.978675 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350c7aef-5a63-4478-b857-a2ad272d4d75-log-httpd\") pod \"350c7aef-5a63-4478-b857-a2ad272d4d75\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.978716 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-sg-core-conf-yaml\") pod \"350c7aef-5a63-4478-b857-a2ad272d4d75\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.979168 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-config-data\") pod \"350c7aef-5a63-4478-b857-a2ad272d4d75\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.979225 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350c7aef-5a63-4478-b857-a2ad272d4d75-run-httpd\") pod \"350c7aef-5a63-4478-b857-a2ad272d4d75\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.979274 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/350c7aef-5a63-4478-b857-a2ad272d4d75-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "350c7aef-5a63-4478-b857-a2ad272d4d75" (UID: 
"350c7aef-5a63-4478-b857-a2ad272d4d75"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.979307 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-scripts\") pod \"350c7aef-5a63-4478-b857-a2ad272d4d75\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.979398 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-ceilometer-tls-certs\") pod \"350c7aef-5a63-4478-b857-a2ad272d4d75\" (UID: \"350c7aef-5a63-4478-b857-a2ad272d4d75\") " Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.979641 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/350c7aef-5a63-4478-b857-a2ad272d4d75-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "350c7aef-5a63-4478-b857-a2ad272d4d75" (UID: "350c7aef-5a63-4478-b857-a2ad272d4d75"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.980184 4628 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350c7aef-5a63-4478-b857-a2ad272d4d75-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.980215 4628 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/350c7aef-5a63-4478-b857-a2ad272d4d75-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:25 crc kubenswrapper[4628]: I1211 05:35:25.991047 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/350c7aef-5a63-4478-b857-a2ad272d4d75-kube-api-access-vlcf9" (OuterVolumeSpecName: "kube-api-access-vlcf9") pod "350c7aef-5a63-4478-b857-a2ad272d4d75" (UID: "350c7aef-5a63-4478-b857-a2ad272d4d75"). InnerVolumeSpecName "kube-api-access-vlcf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.023545 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-scripts" (OuterVolumeSpecName: "scripts") pod "350c7aef-5a63-4478-b857-a2ad272d4d75" (UID: "350c7aef-5a63-4478-b857-a2ad272d4d75"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.035740 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "350c7aef-5a63-4478-b857-a2ad272d4d75" (UID: "350c7aef-5a63-4478-b857-a2ad272d4d75"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.075822 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "350c7aef-5a63-4478-b857-a2ad272d4d75" (UID: "350c7aef-5a63-4478-b857-a2ad272d4d75"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.082438 4628 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.082605 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlcf9\" (UniqueName: \"kubernetes.io/projected/350c7aef-5a63-4478-b857-a2ad272d4d75-kube-api-access-vlcf9\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.082690 4628 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.082765 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.124315 4628 generic.go:334] "Generic (PLEG): container finished" podID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerID="6671190ee3a13597797c780eef9e845e170e76223351084c55af2c87cc5c77fd" exitCode=0 Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.125030 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.125032 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350c7aef-5a63-4478-b857-a2ad272d4d75","Type":"ContainerDied","Data":"6671190ee3a13597797c780eef9e845e170e76223351084c55af2c87cc5c77fd"} Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.125655 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"350c7aef-5a63-4478-b857-a2ad272d4d75","Type":"ContainerDied","Data":"4cbb2495533b619e63e5c5267286d8b272b3253edf7a81c419c0675ec7e14a64"} Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.125682 4628 scope.go:117] "RemoveContainer" containerID="52b73fcf7c230bd4a08743ed7128297c2aa9e77b6cce65dc8a684d29ce2d1dad" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.133247 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "350c7aef-5a63-4478-b857-a2ad272d4d75" (UID: "350c7aef-5a63-4478-b857-a2ad272d4d75"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.152355 4628 scope.go:117] "RemoveContainer" containerID="a869f35e461c880c9010550466393fc2940504ca684c65001162c9583928b4b7" Dec 11 05:35:26 crc kubenswrapper[4628]: W1211 05:35:26.158114 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c58618e_b18c_4ef8_8bc9_d3b08fc5b5af.slice/crio-54f1a30daf833e3c13d9f4f062e023c38239b249d38ea5cf6b480d23bc91bf2e WatchSource:0}: Error finding container 54f1a30daf833e3c13d9f4f062e023c38239b249d38ea5cf6b480d23bc91bf2e: Status 404 returned error can't find the container with id 54f1a30daf833e3c13d9f4f062e023c38239b249d38ea5cf6b480d23bc91bf2e Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.158470 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-z4bpl"] Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.162937 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-config-data" (OuterVolumeSpecName: "config-data") pod "350c7aef-5a63-4478-b857-a2ad272d4d75" (UID: "350c7aef-5a63-4478-b857-a2ad272d4d75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.173183 4628 scope.go:117] "RemoveContainer" containerID="6671190ee3a13597797c780eef9e845e170e76223351084c55af2c87cc5c77fd" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.184653 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.184677 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/350c7aef-5a63-4478-b857-a2ad272d4d75-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.198167 4628 scope.go:117] "RemoveContainer" containerID="898858b28c93e7d104b50ad75512cf245459626a2ed9d4aa3cd6a7b9ff531fcb" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.229084 4628 scope.go:117] "RemoveContainer" containerID="52b73fcf7c230bd4a08743ed7128297c2aa9e77b6cce65dc8a684d29ce2d1dad" Dec 11 05:35:26 crc kubenswrapper[4628]: E1211 05:35:26.230754 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52b73fcf7c230bd4a08743ed7128297c2aa9e77b6cce65dc8a684d29ce2d1dad\": container with ID starting with 52b73fcf7c230bd4a08743ed7128297c2aa9e77b6cce65dc8a684d29ce2d1dad not found: ID does not exist" containerID="52b73fcf7c230bd4a08743ed7128297c2aa9e77b6cce65dc8a684d29ce2d1dad" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.230865 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b73fcf7c230bd4a08743ed7128297c2aa9e77b6cce65dc8a684d29ce2d1dad"} err="failed to get container status \"52b73fcf7c230bd4a08743ed7128297c2aa9e77b6cce65dc8a684d29ce2d1dad\": rpc error: code = NotFound desc = could not find container \"52b73fcf7c230bd4a08743ed7128297c2aa9e77b6cce65dc8a684d29ce2d1dad\": container with ID starting with 52b73fcf7c230bd4a08743ed7128297c2aa9e77b6cce65dc8a684d29ce2d1dad not found: ID does not exist" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.230948 
4628 scope.go:117] "RemoveContainer" containerID="a869f35e461c880c9010550466393fc2940504ca684c65001162c9583928b4b7" Dec 11 05:35:26 crc kubenswrapper[4628]: E1211 05:35:26.231413 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a869f35e461c880c9010550466393fc2940504ca684c65001162c9583928b4b7\": container with ID starting with a869f35e461c880c9010550466393fc2940504ca684c65001162c9583928b4b7 not found: ID does not exist" containerID="a869f35e461c880c9010550466393fc2940504ca684c65001162c9583928b4b7" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.231453 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a869f35e461c880c9010550466393fc2940504ca684c65001162c9583928b4b7"} err="failed to get container status \"a869f35e461c880c9010550466393fc2940504ca684c65001162c9583928b4b7\": rpc error: code = NotFound desc = could not find container \"a869f35e461c880c9010550466393fc2940504ca684c65001162c9583928b4b7\": container with ID starting with a869f35e461c880c9010550466393fc2940504ca684c65001162c9583928b4b7 not found: ID does not exist" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.231475 4628 scope.go:117] "RemoveContainer" containerID="6671190ee3a13597797c780eef9e845e170e76223351084c55af2c87cc5c77fd" Dec 11 05:35:26 crc kubenswrapper[4628]: E1211 05:35:26.231837 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6671190ee3a13597797c780eef9e845e170e76223351084c55af2c87cc5c77fd\": container with ID starting with 6671190ee3a13597797c780eef9e845e170e76223351084c55af2c87cc5c77fd not found: ID does not exist" containerID="6671190ee3a13597797c780eef9e845e170e76223351084c55af2c87cc5c77fd" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.231937 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6671190ee3a13597797c780eef9e845e170e76223351084c55af2c87cc5c77fd"} err="failed to get container status \"6671190ee3a13597797c780eef9e845e170e76223351084c55af2c87cc5c77fd\": rpc error: code = NotFound desc = could not find container \"6671190ee3a13597797c780eef9e845e170e76223351084c55af2c87cc5c77fd\": container with ID starting with 6671190ee3a13597797c780eef9e845e170e76223351084c55af2c87cc5c77fd not found: ID does not exist" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.231954 4628 scope.go:117] "RemoveContainer" containerID="898858b28c93e7d104b50ad75512cf245459626a2ed9d4aa3cd6a7b9ff531fcb" Dec 11 05:35:26 crc kubenswrapper[4628]: E1211 05:35:26.232292 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"898858b28c93e7d104b50ad75512cf245459626a2ed9d4aa3cd6a7b9ff531fcb\": container with ID starting with 898858b28c93e7d104b50ad75512cf245459626a2ed9d4aa3cd6a7b9ff531fcb not found: ID does not exist" containerID="898858b28c93e7d104b50ad75512cf245459626a2ed9d4aa3cd6a7b9ff531fcb" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.232322 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"898858b28c93e7d104b50ad75512cf245459626a2ed9d4aa3cd6a7b9ff531fcb"} err="failed to get container status \"898858b28c93e7d104b50ad75512cf245459626a2ed9d4aa3cd6a7b9ff531fcb\": rpc error: code = NotFound desc = could not find container \"898858b28c93e7d104b50ad75512cf245459626a2ed9d4aa3cd6a7b9ff531fcb\": container with ID starting with 
898858b28c93e7d104b50ad75512cf245459626a2ed9d4aa3cd6a7b9ff531fcb not found: ID does not exist" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.457821 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.467156 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.486959 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:35:26 crc kubenswrapper[4628]: E1211 05:35:26.487365 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerName="sg-core" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.487381 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerName="sg-core" Dec 11 05:35:26 crc kubenswrapper[4628]: E1211 05:35:26.487395 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerName="ceilometer-notification-agent" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.487402 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerName="ceilometer-notification-agent" Dec 11 05:35:26 crc kubenswrapper[4628]: E1211 05:35:26.487424 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerName="proxy-httpd" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.487431 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerName="proxy-httpd" Dec 11 05:35:26 crc kubenswrapper[4628]: E1211 05:35:26.487450 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerName="ceilometer-central-agent" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.487457 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerName="ceilometer-central-agent" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.487627 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerName="ceilometer-central-agent" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.487641 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerName="proxy-httpd" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.487657 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerName="sg-core" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.487665 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="350c7aef-5a63-4478-b857-a2ad272d4d75" containerName="ceilometer-notification-agent" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.489289 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.493370 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.493523 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.493710 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.504204 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.591539 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbhdk\" (UniqueName: \"kubernetes.io/projected/e0091ba0-9c70-41dd-8f21-68968a10a308-kube-api-access-gbhdk\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.591581 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0091ba0-9c70-41dd-8f21-68968a10a308-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.591612 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0091ba0-9c70-41dd-8f21-68968a10a308-log-httpd\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.591658 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0091ba0-9c70-41dd-8f21-68968a10a308-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.591745 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0091ba0-9c70-41dd-8f21-68968a10a308-run-httpd\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.591767 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0091ba0-9c70-41dd-8f21-68968a10a308-scripts\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.591783 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0091ba0-9c70-41dd-8f21-68968a10a308-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.591805 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e0091ba0-9c70-41dd-8f21-68968a10a308-config-data\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.693781 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0091ba0-9c70-41dd-8f21-68968a10a308-run-httpd\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.693833 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0091ba0-9c70-41dd-8f21-68968a10a308-scripts\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.693880 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0091ba0-9c70-41dd-8f21-68968a10a308-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.693908 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0091ba0-9c70-41dd-8f21-68968a10a308-config-data\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.693945 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbhdk\" (UniqueName: \"kubernetes.io/projected/e0091ba0-9c70-41dd-8f21-68968a10a308-kube-api-access-gbhdk\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.693966 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0091ba0-9c70-41dd-8f21-68968a10a308-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.693988 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0091ba0-9c70-41dd-8f21-68968a10a308-log-httpd\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.694034 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0091ba0-9c70-41dd-8f21-68968a10a308-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.694629 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0091ba0-9c70-41dd-8f21-68968a10a308-log-httpd\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.695221 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e0091ba0-9c70-41dd-8f21-68968a10a308-run-httpd\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.697972 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0091ba0-9c70-41dd-8f21-68968a10a308-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.697992 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0091ba0-9c70-41dd-8f21-68968a10a308-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.698486 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0091ba0-9c70-41dd-8f21-68968a10a308-scripts\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.698738 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0091ba0-9c70-41dd-8f21-68968a10a308-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.699616 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0091ba0-9c70-41dd-8f21-68968a10a308-config-data\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.711320 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbhdk\" (UniqueName: \"kubernetes.io/projected/e0091ba0-9c70-41dd-8f21-68968a10a308-kube-api-access-gbhdk\") pod \"ceilometer-0\" (UID: \"e0091ba0-9c70-41dd-8f21-68968a10a308\") " pod="openstack/ceilometer-0" Dec 11 05:35:26 crc kubenswrapper[4628]: I1211 05:35:26.810016 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 11 05:35:27 crc kubenswrapper[4628]: I1211 05:35:27.137317 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z4bpl" event={"ID":"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af","Type":"ContainerStarted","Data":"8f835c5e648cc10d9e830c9db1308e78091fbd6c868b72b8ca834ea31b0a95c4"} Dec 11 05:35:27 crc kubenswrapper[4628]: I1211 05:35:27.137628 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z4bpl" event={"ID":"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af","Type":"ContainerStarted","Data":"54f1a30daf833e3c13d9f4f062e023c38239b249d38ea5cf6b480d23bc91bf2e"} Dec 11 05:35:27 crc kubenswrapper[4628]: I1211 05:35:27.153881 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-z4bpl" podStartSLOduration=2.153867465 podStartE2EDuration="2.153867465s" podCreationTimestamp="2025-12-11 05:35:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:35:27.153544497 +0000 UTC m=+1229.570891195" watchObservedRunningTime="2025-12-11 05:35:27.153867465 +0000 UTC m=+1229.571214163" Dec 11 05:35:27 crc kubenswrapper[4628]: I1211 05:35:27.263400 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 11 05:35:27 crc kubenswrapper[4628]: W1211 05:35:27.271683 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0091ba0_9c70_41dd_8f21_68968a10a308.slice/crio-9cdfe26ce15cb7feb47ddbfae8eccc2aef8b38527d0706d764aec7c31e4ea819 WatchSource:0}: Error finding container 9cdfe26ce15cb7feb47ddbfae8eccc2aef8b38527d0706d764aec7c31e4ea819: Status 404 returned error can't find the container with id 9cdfe26ce15cb7feb47ddbfae8eccc2aef8b38527d0706d764aec7c31e4ea819 Dec 11 05:35:27 crc kubenswrapper[4628]: I1211 05:35:27.927057 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="350c7aef-5a63-4478-b857-a2ad272d4d75" path="/var/lib/kubelet/pods/350c7aef-5a63-4478-b857-a2ad272d4d75/volumes" Dec 11 05:35:28 crc kubenswrapper[4628]: I1211 05:35:28.172907 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0091ba0-9c70-41dd-8f21-68968a10a308","Type":"ContainerStarted","Data":"9cdfe26ce15cb7feb47ddbfae8eccc2aef8b38527d0706d764aec7c31e4ea819"} Dec 11 05:35:29 crc kubenswrapper[4628]: I1211 05:35:29.189008 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0091ba0-9c70-41dd-8f21-68968a10a308","Type":"ContainerStarted","Data":"d5a547d0870c696de6445ecefda4442ec370f121c2c63d43d4f1e8c2dd5a7428"} Dec 11 05:35:30 crc kubenswrapper[4628]: I1211 05:35:30.208875 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0091ba0-9c70-41dd-8f21-68968a10a308","Type":"ContainerStarted","Data":"a7d5eb88ac774d9c96f29c146d5ea3e36fcd5a098005a5e094972bf65c268d24"} Dec 11 05:35:31 crc kubenswrapper[4628]: I1211 05:35:31.233182 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0091ba0-9c70-41dd-8f21-68968a10a308","Type":"ContainerStarted","Data":"a9e0e4dbff12e93ce209c28cace93a70de557e80b06a3e7f1184ca1f9d99b8e4"} Dec 11 05:35:31 crc kubenswrapper[4628]: I1211 05:35:31.485473 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Dec 11 05:35:31 crc kubenswrapper[4628]: I1211 05:35:31.485531 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 11 05:35:32 crc kubenswrapper[4628]: I1211 05:35:32.244764 4628 generic.go:334] "Generic (PLEG): container finished" podID="2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af" containerID="8f835c5e648cc10d9e830c9db1308e78091fbd6c868b72b8ca834ea31b0a95c4" exitCode=0 Dec 11 05:35:32 crc kubenswrapper[4628]: I1211 05:35:32.244867 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z4bpl" event={"ID":"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af","Type":"ContainerDied","Data":"8f835c5e648cc10d9e830c9db1308e78091fbd6c868b72b8ca834ea31b0a95c4"} Dec 11 05:35:32 crc kubenswrapper[4628]: I1211 05:35:32.248156 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0091ba0-9c70-41dd-8f21-68968a10a308","Type":"ContainerStarted","Data":"03d4b7501d09dd830444153be7ac068444386a9394b389fddbb5509301c3017b"} Dec 11 05:35:32 crc kubenswrapper[4628]: I1211 05:35:32.248959 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 11 05:35:32 crc kubenswrapper[4628]: I1211 05:35:32.295068 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.357037712 podStartE2EDuration="6.295042264s" podCreationTimestamp="2025-12-11 05:35:26 +0000 UTC" firstStartedPulling="2025-12-11 05:35:27.273416737 +0000 UTC m=+1229.690763435" lastFinishedPulling="2025-12-11 05:35:31.211421289 +0000 UTC m=+1233.628767987" observedRunningTime="2025-12-11 05:35:32.28459465 +0000 UTC m=+1234.701941358" watchObservedRunningTime="2025-12-11 05:35:32.295042264 +0000 UTC m=+1234.712389002" Dec 11 05:35:32 crc kubenswrapper[4628]: I1211 05:35:32.501037 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c99175af-e6c5-4808-ae11-7b324c1a6e0b" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 05:35:32 crc kubenswrapper[4628]: I1211 05:35:32.501068 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c99175af-e6c5-4808-ae11-7b324c1a6e0b" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.198:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 05:35:33 crc kubenswrapper[4628]: I1211 05:35:33.710826 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z4bpl" Dec 11 05:35:33 crc kubenswrapper[4628]: I1211 05:35:33.832993 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-config-data\") pod \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\" (UID: \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\") " Dec 11 05:35:33 crc kubenswrapper[4628]: I1211 05:35:33.833465 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-combined-ca-bundle\") pod \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\" (UID: \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\") " Dec 11 05:35:33 crc kubenswrapper[4628]: I1211 05:35:33.833492 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhspt\" (UniqueName: \"kubernetes.io/projected/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-kube-api-access-hhspt\") pod \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\" (UID: \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\") " Dec 11 05:35:33 crc kubenswrapper[4628]: I1211 05:35:33.833874 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-scripts\") pod \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\" (UID: \"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af\") " Dec 11 05:35:33 crc kubenswrapper[4628]: I1211 05:35:33.855236 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-scripts" (OuterVolumeSpecName: "scripts") pod "2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af" (UID: "2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:33 crc kubenswrapper[4628]: I1211 05:35:33.855429 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-kube-api-access-hhspt" (OuterVolumeSpecName: "kube-api-access-hhspt") pod "2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af" (UID: "2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af"). InnerVolumeSpecName "kube-api-access-hhspt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:35:33 crc kubenswrapper[4628]: I1211 05:35:33.865046 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-config-data" (OuterVolumeSpecName: "config-data") pod "2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af" (UID: "2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:33 crc kubenswrapper[4628]: I1211 05:35:33.907978 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af" (UID: "2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:33 crc kubenswrapper[4628]: I1211 05:35:33.936413 4628 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-scripts\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:33 crc kubenswrapper[4628]: I1211 05:35:33.936440 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:33 crc kubenswrapper[4628]: I1211 05:35:33.936449 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:33 crc kubenswrapper[4628]: I1211 05:35:33.936460 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhspt\" (UniqueName: \"kubernetes.io/projected/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af-kube-api-access-hhspt\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:34 crc kubenswrapper[4628]: I1211 05:35:34.275951 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z4bpl" Dec 11 05:35:34 crc kubenswrapper[4628]: I1211 05:35:34.286760 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z4bpl" event={"ID":"2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af","Type":"ContainerDied","Data":"54f1a30daf833e3c13d9f4f062e023c38239b249d38ea5cf6b480d23bc91bf2e"} Dec 11 05:35:34 crc kubenswrapper[4628]: I1211 05:35:34.286814 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54f1a30daf833e3c13d9f4f062e023c38239b249d38ea5cf6b480d23bc91bf2e" Dec 11 05:35:34 crc kubenswrapper[4628]: I1211 05:35:34.444813 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 11 05:35:34 crc kubenswrapper[4628]: I1211 05:35:34.445275 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c99175af-e6c5-4808-ae11-7b324c1a6e0b" containerName="nova-api-log" containerID="cri-o://49c0f3ff3b4dacb32da850fe31add5184cf9013931d67791257d530b51fa26b0" gracePeriod=30 Dec 11 05:35:34 crc kubenswrapper[4628]: I1211 05:35:34.445805 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c99175af-e6c5-4808-ae11-7b324c1a6e0b" containerName="nova-api-api" containerID="cri-o://3b194296c4e844836ba0d0341e811b58823bcaf4506cd105a4b6de8b14177ba0" gracePeriod=30 Dec 11 05:35:34 crc kubenswrapper[4628]: I1211 05:35:34.468005 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 05:35:34 crc kubenswrapper[4628]: I1211 05:35:34.468215 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="30bca213-0fd3-4ac3-b075-f3deb4b54cfd" containerName="nova-scheduler-scheduler" containerID="cri-o://1f65d35112a7aa91a6f42a6dd377d163fb54a5bed65c8b8e355dacf4b3fd4ba5" gracePeriod=30 Dec 11 05:35:34 crc kubenswrapper[4628]: I1211 05:35:34.529553 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:35:34 crc kubenswrapper[4628]: I1211 05:35:34.530055 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="40242d08-6531-48a3-8df9-5ee8b069011d" 
containerName="nova-metadata-metadata" containerID="cri-o://80f80738f0e7de0b40c4235784d8a93bc94732cbf9f968ba69d0141dbe54d4b4" gracePeriod=30 Dec 11 05:35:34 crc kubenswrapper[4628]: I1211 05:35:34.530198 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="40242d08-6531-48a3-8df9-5ee8b069011d" containerName="nova-metadata-log" containerID="cri-o://b1a0ce8a73c5b65861e65fc403f7cb16cc1f43c6fffad3100e732b5414b2c91c" gracePeriod=30 Dec 11 05:35:35 crc kubenswrapper[4628]: I1211 05:35:35.286651 4628 generic.go:334] "Generic (PLEG): container finished" podID="c99175af-e6c5-4808-ae11-7b324c1a6e0b" containerID="49c0f3ff3b4dacb32da850fe31add5184cf9013931d67791257d530b51fa26b0" exitCode=143 Dec 11 05:35:35 crc kubenswrapper[4628]: I1211 05:35:35.286749 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c99175af-e6c5-4808-ae11-7b324c1a6e0b","Type":"ContainerDied","Data":"49c0f3ff3b4dacb32da850fe31add5184cf9013931d67791257d530b51fa26b0"} Dec 11 05:35:35 crc kubenswrapper[4628]: I1211 05:35:35.288950 4628 generic.go:334] "Generic (PLEG): container finished" podID="40242d08-6531-48a3-8df9-5ee8b069011d" containerID="b1a0ce8a73c5b65861e65fc403f7cb16cc1f43c6fffad3100e732b5414b2c91c" exitCode=143 Dec 11 05:35:35 crc kubenswrapper[4628]: I1211 05:35:35.289030 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"40242d08-6531-48a3-8df9-5ee8b069011d","Type":"ContainerDied","Data":"b1a0ce8a73c5b65861e65fc403f7cb16cc1f43c6fffad3100e732b5414b2c91c"} Dec 11 05:35:36 crc kubenswrapper[4628]: E1211 05:35:36.051357 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f65d35112a7aa91a6f42a6dd377d163fb54a5bed65c8b8e355dacf4b3fd4ba5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 11 05:35:36 crc kubenswrapper[4628]: E1211 05:35:36.053420 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f65d35112a7aa91a6f42a6dd377d163fb54a5bed65c8b8e355dacf4b3fd4ba5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 11 05:35:36 crc kubenswrapper[4628]: E1211 05:35:36.057203 4628 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1f65d35112a7aa91a6f42a6dd377d163fb54a5bed65c8b8e355dacf4b3fd4ba5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 11 05:35:36 crc kubenswrapper[4628]: E1211 05:35:36.057444 4628 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="30bca213-0fd3-4ac3-b075-f3deb4b54cfd" containerName="nova-scheduler-scheduler" Dec 11 05:35:37 crc kubenswrapper[4628]: I1211 05:35:37.685250 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="40242d08-6531-48a3-8df9-5ee8b069011d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:59934->10.217.0.193:8775: read: connection reset by peer" Dec 11 05:35:37 crc 
kubenswrapper[4628]: I1211 05:35:37.686191 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="40242d08-6531-48a3-8df9-5ee8b069011d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:59944->10.217.0.193:8775: read: connection reset by peer" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.186992 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.193585 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.229440 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2stdq\" (UniqueName: \"kubernetes.io/projected/40242d08-6531-48a3-8df9-5ee8b069011d-kube-api-access-2stdq\") pod \"40242d08-6531-48a3-8df9-5ee8b069011d\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.229537 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40242d08-6531-48a3-8df9-5ee8b069011d-logs\") pod \"40242d08-6531-48a3-8df9-5ee8b069011d\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.229583 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvpfb\" (UniqueName: \"kubernetes.io/projected/c99175af-e6c5-4808-ae11-7b324c1a6e0b-kube-api-access-rvpfb\") pod \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.229624 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-combined-ca-bundle\") pod \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.229643 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40242d08-6531-48a3-8df9-5ee8b069011d-combined-ca-bundle\") pod \"40242d08-6531-48a3-8df9-5ee8b069011d\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.229665 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-config-data\") pod \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.229695 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c99175af-e6c5-4808-ae11-7b324c1a6e0b-logs\") pod \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.235027 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40242d08-6531-48a3-8df9-5ee8b069011d-logs" (OuterVolumeSpecName: "logs") pod "40242d08-6531-48a3-8df9-5ee8b069011d" (UID: "40242d08-6531-48a3-8df9-5ee8b069011d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.239020 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c99175af-e6c5-4808-ae11-7b324c1a6e0b-logs" (OuterVolumeSpecName: "logs") pod "c99175af-e6c5-4808-ae11-7b324c1a6e0b" (UID: "c99175af-e6c5-4808-ae11-7b324c1a6e0b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.248048 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-public-tls-certs\") pod \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.248120 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-internal-tls-certs\") pod \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\" (UID: \"c99175af-e6c5-4808-ae11-7b324c1a6e0b\") " Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.248241 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/40242d08-6531-48a3-8df9-5ee8b069011d-nova-metadata-tls-certs\") pod \"40242d08-6531-48a3-8df9-5ee8b069011d\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.248266 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40242d08-6531-48a3-8df9-5ee8b069011d-config-data\") pod \"40242d08-6531-48a3-8df9-5ee8b069011d\" (UID: \"40242d08-6531-48a3-8df9-5ee8b069011d\") " Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.248901 4628 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40242d08-6531-48a3-8df9-5ee8b069011d-logs\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.248913 4628 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c99175af-e6c5-4808-ae11-7b324c1a6e0b-logs\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.257990 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40242d08-6531-48a3-8df9-5ee8b069011d-kube-api-access-2stdq" (OuterVolumeSpecName: "kube-api-access-2stdq") pod "40242d08-6531-48a3-8df9-5ee8b069011d" (UID: "40242d08-6531-48a3-8df9-5ee8b069011d"). InnerVolumeSpecName "kube-api-access-2stdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.258071 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c99175af-e6c5-4808-ae11-7b324c1a6e0b-kube-api-access-rvpfb" (OuterVolumeSpecName: "kube-api-access-rvpfb") pod "c99175af-e6c5-4808-ae11-7b324c1a6e0b" (UID: "c99175af-e6c5-4808-ae11-7b324c1a6e0b"). InnerVolumeSpecName "kube-api-access-rvpfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.284497 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40242d08-6531-48a3-8df9-5ee8b069011d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40242d08-6531-48a3-8df9-5ee8b069011d" (UID: "40242d08-6531-48a3-8df9-5ee8b069011d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.291400 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-config-data" (OuterVolumeSpecName: "config-data") pod "c99175af-e6c5-4808-ae11-7b324c1a6e0b" (UID: "c99175af-e6c5-4808-ae11-7b324c1a6e0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.308462 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c99175af-e6c5-4808-ae11-7b324c1a6e0b" (UID: "c99175af-e6c5-4808-ae11-7b324c1a6e0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.321297 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40242d08-6531-48a3-8df9-5ee8b069011d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "40242d08-6531-48a3-8df9-5ee8b069011d" (UID: "40242d08-6531-48a3-8df9-5ee8b069011d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.323234 4628 generic.go:334] "Generic (PLEG): container finished" podID="40242d08-6531-48a3-8df9-5ee8b069011d" containerID="80f80738f0e7de0b40c4235784d8a93bc94732cbf9f968ba69d0141dbe54d4b4" exitCode=0 Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.323320 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"40242d08-6531-48a3-8df9-5ee8b069011d","Type":"ContainerDied","Data":"80f80738f0e7de0b40c4235784d8a93bc94732cbf9f968ba69d0141dbe54d4b4"} Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.323460 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"40242d08-6531-48a3-8df9-5ee8b069011d","Type":"ContainerDied","Data":"6730b1b7e4f4e88bd92530fc88d7200e7081365b07ec5b47f6aaca0832c22abc"} Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.323482 4628 scope.go:117] "RemoveContainer" containerID="80f80738f0e7de0b40c4235784d8a93bc94732cbf9f968ba69d0141dbe54d4b4" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.323539 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.326626 4628 generic.go:334] "Generic (PLEG): container finished" podID="c99175af-e6c5-4808-ae11-7b324c1a6e0b" containerID="3b194296c4e844836ba0d0341e811b58823bcaf4506cd105a4b6de8b14177ba0" exitCode=0 Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.326662 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c99175af-e6c5-4808-ae11-7b324c1a6e0b","Type":"ContainerDied","Data":"3b194296c4e844836ba0d0341e811b58823bcaf4506cd105a4b6de8b14177ba0"} Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.326689 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c99175af-e6c5-4808-ae11-7b324c1a6e0b","Type":"ContainerDied","Data":"d07872c05f27a59cb0e78830ecd11bc94c676c7f5fd7fc1d1fe98dd40965c962"} Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.326733 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.345979 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c99175af-e6c5-4808-ae11-7b324c1a6e0b" (UID: "c99175af-e6c5-4808-ae11-7b324c1a6e0b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.348823 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40242d08-6531-48a3-8df9-5ee8b069011d-config-data" (OuterVolumeSpecName: "config-data") pod "40242d08-6531-48a3-8df9-5ee8b069011d" (UID: "40242d08-6531-48a3-8df9-5ee8b069011d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.351216 4628 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/40242d08-6531-48a3-8df9-5ee8b069011d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.351236 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40242d08-6531-48a3-8df9-5ee8b069011d-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.351245 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2stdq\" (UniqueName: \"kubernetes.io/projected/40242d08-6531-48a3-8df9-5ee8b069011d-kube-api-access-2stdq\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.351254 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvpfb\" (UniqueName: \"kubernetes.io/projected/c99175af-e6c5-4808-ae11-7b324c1a6e0b-kube-api-access-rvpfb\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.351264 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.351273 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40242d08-6531-48a3-8df9-5ee8b069011d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.351281 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.351289 4628 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.375555 4628 scope.go:117] "RemoveContainer" containerID="b1a0ce8a73c5b65861e65fc403f7cb16cc1f43c6fffad3100e732b5414b2c91c" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.375758 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c99175af-e6c5-4808-ae11-7b324c1a6e0b" (UID: "c99175af-e6c5-4808-ae11-7b324c1a6e0b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.410397 4628 scope.go:117] "RemoveContainer" containerID="80f80738f0e7de0b40c4235784d8a93bc94732cbf9f968ba69d0141dbe54d4b4" Dec 11 05:35:38 crc kubenswrapper[4628]: E1211 05:35:38.413124 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80f80738f0e7de0b40c4235784d8a93bc94732cbf9f968ba69d0141dbe54d4b4\": container with ID starting with 80f80738f0e7de0b40c4235784d8a93bc94732cbf9f968ba69d0141dbe54d4b4 not found: ID does not exist" containerID="80f80738f0e7de0b40c4235784d8a93bc94732cbf9f968ba69d0141dbe54d4b4" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.413157 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f80738f0e7de0b40c4235784d8a93bc94732cbf9f968ba69d0141dbe54d4b4"} err="failed to get container status \"80f80738f0e7de0b40c4235784d8a93bc94732cbf9f968ba69d0141dbe54d4b4\": rpc error: code = NotFound desc = could not find container \"80f80738f0e7de0b40c4235784d8a93bc94732cbf9f968ba69d0141dbe54d4b4\": container with ID starting with 80f80738f0e7de0b40c4235784d8a93bc94732cbf9f968ba69d0141dbe54d4b4 not found: ID does not exist" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.413181 4628 scope.go:117] "RemoveContainer" containerID="b1a0ce8a73c5b65861e65fc403f7cb16cc1f43c6fffad3100e732b5414b2c91c" Dec 11 05:35:38 crc kubenswrapper[4628]: E1211 05:35:38.413366 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1a0ce8a73c5b65861e65fc403f7cb16cc1f43c6fffad3100e732b5414b2c91c\": container with ID starting with b1a0ce8a73c5b65861e65fc403f7cb16cc1f43c6fffad3100e732b5414b2c91c not found: ID does not exist" containerID="b1a0ce8a73c5b65861e65fc403f7cb16cc1f43c6fffad3100e732b5414b2c91c" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.413386 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1a0ce8a73c5b65861e65fc403f7cb16cc1f43c6fffad3100e732b5414b2c91c"} err="failed to get container status \"b1a0ce8a73c5b65861e65fc403f7cb16cc1f43c6fffad3100e732b5414b2c91c\": rpc error: code = NotFound desc = could not find container \"b1a0ce8a73c5b65861e65fc403f7cb16cc1f43c6fffad3100e732b5414b2c91c\": container with ID starting with b1a0ce8a73c5b65861e65fc403f7cb16cc1f43c6fffad3100e732b5414b2c91c not found: ID does not exist" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.413401 4628 scope.go:117] "RemoveContainer" containerID="3b194296c4e844836ba0d0341e811b58823bcaf4506cd105a4b6de8b14177ba0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.431909 4628 scope.go:117] "RemoveContainer" containerID="49c0f3ff3b4dacb32da850fe31add5184cf9013931d67791257d530b51fa26b0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.453295 4628 scope.go:117] "RemoveContainer" containerID="3b194296c4e844836ba0d0341e811b58823bcaf4506cd105a4b6de8b14177ba0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.453399 4628 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c99175af-e6c5-4808-ae11-7b324c1a6e0b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:38 crc kubenswrapper[4628]: E1211 05:35:38.453699 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3b194296c4e844836ba0d0341e811b58823bcaf4506cd105a4b6de8b14177ba0\": container with ID starting with 3b194296c4e844836ba0d0341e811b58823bcaf4506cd105a4b6de8b14177ba0 not found: ID does not exist" containerID="3b194296c4e844836ba0d0341e811b58823bcaf4506cd105a4b6de8b14177ba0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.453738 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b194296c4e844836ba0d0341e811b58823bcaf4506cd105a4b6de8b14177ba0"} err="failed to get container status \"3b194296c4e844836ba0d0341e811b58823bcaf4506cd105a4b6de8b14177ba0\": rpc error: code = NotFound desc = could not find container \"3b194296c4e844836ba0d0341e811b58823bcaf4506cd105a4b6de8b14177ba0\": container with ID starting with 3b194296c4e844836ba0d0341e811b58823bcaf4506cd105a4b6de8b14177ba0 not found: ID does not exist" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.453766 4628 scope.go:117] "RemoveContainer" containerID="49c0f3ff3b4dacb32da850fe31add5184cf9013931d67791257d530b51fa26b0" Dec 11 05:35:38 crc kubenswrapper[4628]: E1211 05:35:38.455104 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49c0f3ff3b4dacb32da850fe31add5184cf9013931d67791257d530b51fa26b0\": container with ID starting with 49c0f3ff3b4dacb32da850fe31add5184cf9013931d67791257d530b51fa26b0 not found: ID does not exist" containerID="49c0f3ff3b4dacb32da850fe31add5184cf9013931d67791257d530b51fa26b0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.455135 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49c0f3ff3b4dacb32da850fe31add5184cf9013931d67791257d530b51fa26b0"} err="failed to get container status \"49c0f3ff3b4dacb32da850fe31add5184cf9013931d67791257d530b51fa26b0\": rpc error: code = NotFound desc = could not find container \"49c0f3ff3b4dacb32da850fe31add5184cf9013931d67791257d530b51fa26b0\": container with ID starting with 49c0f3ff3b4dacb32da850fe31add5184cf9013931d67791257d530b51fa26b0 not found: ID does not exist" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.681368 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.700907 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.754939 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:35:38 crc kubenswrapper[4628]: E1211 05:35:38.755377 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40242d08-6531-48a3-8df9-5ee8b069011d" containerName="nova-metadata-log" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.755390 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="40242d08-6531-48a3-8df9-5ee8b069011d" containerName="nova-metadata-log" Dec 11 05:35:38 crc kubenswrapper[4628]: E1211 05:35:38.755414 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af" containerName="nova-manage" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.755420 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af" containerName="nova-manage" Dec 11 05:35:38 crc kubenswrapper[4628]: E1211 05:35:38.755432 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99175af-e6c5-4808-ae11-7b324c1a6e0b" containerName="nova-api-log" Dec 11 
05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.755438 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99175af-e6c5-4808-ae11-7b324c1a6e0b" containerName="nova-api-log" Dec 11 05:35:38 crc kubenswrapper[4628]: E1211 05:35:38.755452 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99175af-e6c5-4808-ae11-7b324c1a6e0b" containerName="nova-api-api" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.755458 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99175af-e6c5-4808-ae11-7b324c1a6e0b" containerName="nova-api-api" Dec 11 05:35:38 crc kubenswrapper[4628]: E1211 05:35:38.755488 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40242d08-6531-48a3-8df9-5ee8b069011d" containerName="nova-metadata-metadata" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.755494 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="40242d08-6531-48a3-8df9-5ee8b069011d" containerName="nova-metadata-metadata" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.755661 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="40242d08-6531-48a3-8df9-5ee8b069011d" containerName="nova-metadata-metadata" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.755674 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="40242d08-6531-48a3-8df9-5ee8b069011d" containerName="nova-metadata-log" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.755686 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af" containerName="nova-manage" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.755699 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="c99175af-e6c5-4808-ae11-7b324c1a6e0b" containerName="nova-api-api" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.755711 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="c99175af-e6c5-4808-ae11-7b324c1a6e0b" containerName="nova-api-log" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.756665 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.763136 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.767891 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.767953 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.772310 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.779896 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.788292 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.790193 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.792813 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.793086 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.793216 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.798342 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.862922 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a14c2329-c910-45b3-a28c-258f07a31c5f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a14c2329-c910-45b3-a28c-258f07a31c5f\") " pod="openstack/nova-metadata-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.862972 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfa4c6b3-3b46-4383-8813-38a038d8e0da-config-data\") pod \"nova-api-0\" (UID: \"dfa4c6b3-3b46-4383-8813-38a038d8e0da\") " pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.863002 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a14c2329-c910-45b3-a28c-258f07a31c5f-logs\") pod \"nova-metadata-0\" (UID: \"a14c2329-c910-45b3-a28c-258f07a31c5f\") " pod="openstack/nova-metadata-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.863030 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfa4c6b3-3b46-4383-8813-38a038d8e0da-logs\") pod \"nova-api-0\" (UID: \"dfa4c6b3-3b46-4383-8813-38a038d8e0da\") " pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.863057 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbbpc\" (UniqueName: \"kubernetes.io/projected/a14c2329-c910-45b3-a28c-258f07a31c5f-kube-api-access-xbbpc\") pod \"nova-metadata-0\" (UID: \"a14c2329-c910-45b3-a28c-258f07a31c5f\") " pod="openstack/nova-metadata-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.863076 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfa4c6b3-3b46-4383-8813-38a038d8e0da-public-tls-certs\") pod \"nova-api-0\" (UID: \"dfa4c6b3-3b46-4383-8813-38a038d8e0da\") " pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.863095 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfa4c6b3-3b46-4383-8813-38a038d8e0da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dfa4c6b3-3b46-4383-8813-38a038d8e0da\") " pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.863149 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dfa4c6b3-3b46-4383-8813-38a038d8e0da-internal-tls-certs\") pod \"nova-api-0\" (UID: \"dfa4c6b3-3b46-4383-8813-38a038d8e0da\") " pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.863175 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14c2329-c910-45b3-a28c-258f07a31c5f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a14c2329-c910-45b3-a28c-258f07a31c5f\") " pod="openstack/nova-metadata-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.863193 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbch5\" (UniqueName: \"kubernetes.io/projected/dfa4c6b3-3b46-4383-8813-38a038d8e0da-kube-api-access-gbch5\") pod \"nova-api-0\" (UID: \"dfa4c6b3-3b46-4383-8813-38a038d8e0da\") " pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.863211 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a14c2329-c910-45b3-a28c-258f07a31c5f-config-data\") pod \"nova-metadata-0\" (UID: \"a14c2329-c910-45b3-a28c-258f07a31c5f\") " pod="openstack/nova-metadata-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.964606 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbch5\" (UniqueName: \"kubernetes.io/projected/dfa4c6b3-3b46-4383-8813-38a038d8e0da-kube-api-access-gbch5\") pod \"nova-api-0\" (UID: \"dfa4c6b3-3b46-4383-8813-38a038d8e0da\") " pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.965150 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a14c2329-c910-45b3-a28c-258f07a31c5f-config-data\") pod \"nova-metadata-0\" (UID: \"a14c2329-c910-45b3-a28c-258f07a31c5f\") " pod="openstack/nova-metadata-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.966441 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a14c2329-c910-45b3-a28c-258f07a31c5f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a14c2329-c910-45b3-a28c-258f07a31c5f\") " pod="openstack/nova-metadata-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.967080 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfa4c6b3-3b46-4383-8813-38a038d8e0da-config-data\") pod \"nova-api-0\" (UID: \"dfa4c6b3-3b46-4383-8813-38a038d8e0da\") " pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.967255 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a14c2329-c910-45b3-a28c-258f07a31c5f-logs\") pod \"nova-metadata-0\" (UID: \"a14c2329-c910-45b3-a28c-258f07a31c5f\") " pod="openstack/nova-metadata-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.967391 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfa4c6b3-3b46-4383-8813-38a038d8e0da-logs\") pod \"nova-api-0\" (UID: \"dfa4c6b3-3b46-4383-8813-38a038d8e0da\") " pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.967517 4628 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xbbpc\" (UniqueName: \"kubernetes.io/projected/a14c2329-c910-45b3-a28c-258f07a31c5f-kube-api-access-xbbpc\") pod \"nova-metadata-0\" (UID: \"a14c2329-c910-45b3-a28c-258f07a31c5f\") " pod="openstack/nova-metadata-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.967639 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfa4c6b3-3b46-4383-8813-38a038d8e0da-public-tls-certs\") pod \"nova-api-0\" (UID: \"dfa4c6b3-3b46-4383-8813-38a038d8e0da\") " pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.967763 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfa4c6b3-3b46-4383-8813-38a038d8e0da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dfa4c6b3-3b46-4383-8813-38a038d8e0da\") " pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.967986 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a14c2329-c910-45b3-a28c-258f07a31c5f-logs\") pod \"nova-metadata-0\" (UID: \"a14c2329-c910-45b3-a28c-258f07a31c5f\") " pod="openstack/nova-metadata-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.968566 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfa4c6b3-3b46-4383-8813-38a038d8e0da-internal-tls-certs\") pod \"nova-api-0\" (UID: \"dfa4c6b3-3b46-4383-8813-38a038d8e0da\") " pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.968723 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14c2329-c910-45b3-a28c-258f07a31c5f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a14c2329-c910-45b3-a28c-258f07a31c5f\") " pod="openstack/nova-metadata-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.968604 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfa4c6b3-3b46-4383-8813-38a038d8e0da-logs\") pod \"nova-api-0\" (UID: \"dfa4c6b3-3b46-4383-8813-38a038d8e0da\") " pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.972217 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfa4c6b3-3b46-4383-8813-38a038d8e0da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dfa4c6b3-3b46-4383-8813-38a038d8e0da\") " pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.972360 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a14c2329-c910-45b3-a28c-258f07a31c5f-config-data\") pod \"nova-metadata-0\" (UID: \"a14c2329-c910-45b3-a28c-258f07a31c5f\") " pod="openstack/nova-metadata-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.972433 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfa4c6b3-3b46-4383-8813-38a038d8e0da-public-tls-certs\") pod \"nova-api-0\" (UID: \"dfa4c6b3-3b46-4383-8813-38a038d8e0da\") " pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.972904 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfa4c6b3-3b46-4383-8813-38a038d8e0da-config-data\") pod \"nova-api-0\" (UID: \"dfa4c6b3-3b46-4383-8813-38a038d8e0da\") " pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.973913 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a14c2329-c910-45b3-a28c-258f07a31c5f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a14c2329-c910-45b3-a28c-258f07a31c5f\") " pod="openstack/nova-metadata-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.981712 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfa4c6b3-3b46-4383-8813-38a038d8e0da-internal-tls-certs\") pod \"nova-api-0\" (UID: \"dfa4c6b3-3b46-4383-8813-38a038d8e0da\") " pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.981927 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14c2329-c910-45b3-a28c-258f07a31c5f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a14c2329-c910-45b3-a28c-258f07a31c5f\") " pod="openstack/nova-metadata-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.984729 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbch5\" (UniqueName: \"kubernetes.io/projected/dfa4c6b3-3b46-4383-8813-38a038d8e0da-kube-api-access-gbch5\") pod \"nova-api-0\" (UID: \"dfa4c6b3-3b46-4383-8813-38a038d8e0da\") " pod="openstack/nova-api-0" Dec 11 05:35:38 crc kubenswrapper[4628]: I1211 05:35:38.991406 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbbpc\" (UniqueName: \"kubernetes.io/projected/a14c2329-c910-45b3-a28c-258f07a31c5f-kube-api-access-xbbpc\") pod \"nova-metadata-0\" (UID: \"a14c2329-c910-45b3-a28c-258f07a31c5f\") " pod="openstack/nova-metadata-0" Dec 11 05:35:39 crc kubenswrapper[4628]: I1211 05:35:39.104352 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 11 05:35:39 crc kubenswrapper[4628]: I1211 05:35:39.114646 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 11 05:35:39 crc kubenswrapper[4628]: I1211 05:35:39.596578 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 11 05:35:39 crc kubenswrapper[4628]: W1211 05:35:39.604759 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda14c2329_c910_45b3_a28c_258f07a31c5f.slice/crio-e055ad62c11f580f2a964af5952d7f744f743209c45ad6b17ba914172a945f68 WatchSource:0}: Error finding container e055ad62c11f580f2a964af5952d7f744f743209c45ad6b17ba914172a945f68: Status 404 returned error can't find the container with id e055ad62c11f580f2a964af5952d7f744f743209c45ad6b17ba914172a945f68 Dec 11 05:35:39 crc kubenswrapper[4628]: I1211 05:35:39.683281 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 11 05:35:39 crc kubenswrapper[4628]: I1211 05:35:39.900227 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40242d08-6531-48a3-8df9-5ee8b069011d" path="/var/lib/kubelet/pods/40242d08-6531-48a3-8df9-5ee8b069011d/volumes" Dec 11 05:35:39 crc kubenswrapper[4628]: I1211 05:35:39.900837 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c99175af-e6c5-4808-ae11-7b324c1a6e0b" path="/var/lib/kubelet/pods/c99175af-e6c5-4808-ae11-7b324c1a6e0b/volumes" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.269408 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.345756 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dfa4c6b3-3b46-4383-8813-38a038d8e0da","Type":"ContainerStarted","Data":"e4aa65db2b3eb66be922523876a5e0242115d6c91eec89b293a138069f4e8f40"} Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.345802 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dfa4c6b3-3b46-4383-8813-38a038d8e0da","Type":"ContainerStarted","Data":"a0bbaabcc8bb41b6f02f62e76f011653190c602aee572d57bd7927d18af0b997"} Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.345813 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dfa4c6b3-3b46-4383-8813-38a038d8e0da","Type":"ContainerStarted","Data":"7cc85281dd40ad2545070da28733f733aadc2921b1078d1a7d6dd9f569a737b4"} Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.350761 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a14c2329-c910-45b3-a28c-258f07a31c5f","Type":"ContainerStarted","Data":"405780cf1f59974e57cada32485b1ad769c8c53771ff5f1e4681ecd5e808fd56"} Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.350796 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a14c2329-c910-45b3-a28c-258f07a31c5f","Type":"ContainerStarted","Data":"2203b3607ba38962231421cbb202e7726797a77db1648dbb08697fff6b78a8f7"} Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.350807 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a14c2329-c910-45b3-a28c-258f07a31c5f","Type":"ContainerStarted","Data":"e055ad62c11f580f2a964af5952d7f744f743209c45ad6b17ba914172a945f68"} Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.351967 4628 generic.go:334] "Generic (PLEG): container finished" podID="30bca213-0fd3-4ac3-b075-f3deb4b54cfd" 
containerID="1f65d35112a7aa91a6f42a6dd377d163fb54a5bed65c8b8e355dacf4b3fd4ba5" exitCode=0 Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.352065 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30bca213-0fd3-4ac3-b075-f3deb4b54cfd","Type":"ContainerDied","Data":"1f65d35112a7aa91a6f42a6dd377d163fb54a5bed65c8b8e355dacf4b3fd4ba5"} Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.352149 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30bca213-0fd3-4ac3-b075-f3deb4b54cfd","Type":"ContainerDied","Data":"10d51de80aeebbabeebd649708416a2e0e2c995d4b092ea0c5e69e90ad9f3b65"} Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.352221 4628 scope.go:117] "RemoveContainer" containerID="1f65d35112a7aa91a6f42a6dd377d163fb54a5bed65c8b8e355dacf4b3fd4ba5" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.352389 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.370368 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.370347501 podStartE2EDuration="2.370347501s" podCreationTimestamp="2025-12-11 05:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:35:40.364657596 +0000 UTC m=+1242.782004294" watchObservedRunningTime="2025-12-11 05:35:40.370347501 +0000 UTC m=+1242.787694199" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.392036 4628 scope.go:117] "RemoveContainer" containerID="1f65d35112a7aa91a6f42a6dd377d163fb54a5bed65c8b8e355dacf4b3fd4ba5" Dec 11 05:35:40 crc kubenswrapper[4628]: E1211 05:35:40.397278 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f65d35112a7aa91a6f42a6dd377d163fb54a5bed65c8b8e355dacf4b3fd4ba5\": container with ID starting with 1f65d35112a7aa91a6f42a6dd377d163fb54a5bed65c8b8e355dacf4b3fd4ba5 not found: ID does not exist" containerID="1f65d35112a7aa91a6f42a6dd377d163fb54a5bed65c8b8e355dacf4b3fd4ba5" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.405782 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f65d35112a7aa91a6f42a6dd377d163fb54a5bed65c8b8e355dacf4b3fd4ba5"} err="failed to get container status \"1f65d35112a7aa91a6f42a6dd377d163fb54a5bed65c8b8e355dacf4b3fd4ba5\": rpc error: code = NotFound desc = could not find container \"1f65d35112a7aa91a6f42a6dd377d163fb54a5bed65c8b8e355dacf4b3fd4ba5\": container with ID starting with 1f65d35112a7aa91a6f42a6dd377d163fb54a5bed65c8b8e355dacf4b3fd4ba5 not found: ID does not exist" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.402470 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30bca213-0fd3-4ac3-b075-f3deb4b54cfd-config-data\") pod \"30bca213-0fd3-4ac3-b075-f3deb4b54cfd\" (UID: \"30bca213-0fd3-4ac3-b075-f3deb4b54cfd\") " Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.406189 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knjj5\" (UniqueName: \"kubernetes.io/projected/30bca213-0fd3-4ac3-b075-f3deb4b54cfd-kube-api-access-knjj5\") pod \"30bca213-0fd3-4ac3-b075-f3deb4b54cfd\" (UID: \"30bca213-0fd3-4ac3-b075-f3deb4b54cfd\") " Dec 
11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.396739 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.396717559 podStartE2EDuration="2.396717559s" podCreationTimestamp="2025-12-11 05:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:35:40.388406512 +0000 UTC m=+1242.805753210" watchObservedRunningTime="2025-12-11 05:35:40.396717559 +0000 UTC m=+1242.814064257" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.406475 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30bca213-0fd3-4ac3-b075-f3deb4b54cfd-combined-ca-bundle\") pod \"30bca213-0fd3-4ac3-b075-f3deb4b54cfd\" (UID: \"30bca213-0fd3-4ac3-b075-f3deb4b54cfd\") " Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.413585 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30bca213-0fd3-4ac3-b075-f3deb4b54cfd-kube-api-access-knjj5" (OuterVolumeSpecName: "kube-api-access-knjj5") pod "30bca213-0fd3-4ac3-b075-f3deb4b54cfd" (UID: "30bca213-0fd3-4ac3-b075-f3deb4b54cfd"). InnerVolumeSpecName "kube-api-access-knjj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.435570 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30bca213-0fd3-4ac3-b075-f3deb4b54cfd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30bca213-0fd3-4ac3-b075-f3deb4b54cfd" (UID: "30bca213-0fd3-4ac3-b075-f3deb4b54cfd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.472186 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30bca213-0fd3-4ac3-b075-f3deb4b54cfd-config-data" (OuterVolumeSpecName: "config-data") pod "30bca213-0fd3-4ac3-b075-f3deb4b54cfd" (UID: "30bca213-0fd3-4ac3-b075-f3deb4b54cfd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.510276 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30bca213-0fd3-4ac3-b075-f3deb4b54cfd-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.510321 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knjj5\" (UniqueName: \"kubernetes.io/projected/30bca213-0fd3-4ac3-b075-f3deb4b54cfd-kube-api-access-knjj5\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.510336 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30bca213-0fd3-4ac3-b075-f3deb4b54cfd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.678186 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.685866 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.693964 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 05:35:40 crc kubenswrapper[4628]: E1211 05:35:40.694507 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30bca213-0fd3-4ac3-b075-f3deb4b54cfd" containerName="nova-scheduler-scheduler" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.694530 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="30bca213-0fd3-4ac3-b075-f3deb4b54cfd" containerName="nova-scheduler-scheduler" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.694993 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="30bca213-0fd3-4ac3-b075-f3deb4b54cfd" containerName="nova-scheduler-scheduler" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.695616 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.699710 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.716265 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.815017 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/997489dd-97c9-4359-9920-8bdb512f708b-config-data\") pod \"nova-scheduler-0\" (UID: \"997489dd-97c9-4359-9920-8bdb512f708b\") " pod="openstack/nova-scheduler-0" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.815386 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/997489dd-97c9-4359-9920-8bdb512f708b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"997489dd-97c9-4359-9920-8bdb512f708b\") " pod="openstack/nova-scheduler-0" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.815516 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr7zs\" (UniqueName: \"kubernetes.io/projected/997489dd-97c9-4359-9920-8bdb512f708b-kube-api-access-tr7zs\") pod \"nova-scheduler-0\" (UID: \"997489dd-97c9-4359-9920-8bdb512f708b\") " pod="openstack/nova-scheduler-0" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.916623 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/997489dd-97c9-4359-9920-8bdb512f708b-config-data\") pod \"nova-scheduler-0\" (UID: \"997489dd-97c9-4359-9920-8bdb512f708b\") " pod="openstack/nova-scheduler-0" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.916760 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/997489dd-97c9-4359-9920-8bdb512f708b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"997489dd-97c9-4359-9920-8bdb512f708b\") " pod="openstack/nova-scheduler-0" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.916784 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr7zs\" (UniqueName: \"kubernetes.io/projected/997489dd-97c9-4359-9920-8bdb512f708b-kube-api-access-tr7zs\") pod \"nova-scheduler-0\" (UID: \"997489dd-97c9-4359-9920-8bdb512f708b\") " pod="openstack/nova-scheduler-0" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.921416 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/997489dd-97c9-4359-9920-8bdb512f708b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"997489dd-97c9-4359-9920-8bdb512f708b\") " pod="openstack/nova-scheduler-0" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.921655 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/997489dd-97c9-4359-9920-8bdb512f708b-config-data\") pod \"nova-scheduler-0\" (UID: \"997489dd-97c9-4359-9920-8bdb512f708b\") " pod="openstack/nova-scheduler-0" Dec 11 05:35:40 crc kubenswrapper[4628]: I1211 05:35:40.935629 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr7zs\" (UniqueName: 
\"kubernetes.io/projected/997489dd-97c9-4359-9920-8bdb512f708b-kube-api-access-tr7zs\") pod \"nova-scheduler-0\" (UID: \"997489dd-97c9-4359-9920-8bdb512f708b\") " pod="openstack/nova-scheduler-0" Dec 11 05:35:41 crc kubenswrapper[4628]: I1211 05:35:41.069519 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 11 05:35:41 crc kubenswrapper[4628]: I1211 05:35:41.548632 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 11 05:35:41 crc kubenswrapper[4628]: W1211 05:35:41.550129 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod997489dd_97c9_4359_9920_8bdb512f708b.slice/crio-31c1e99b5a62210ae0309889bb8b04e9f88fe804a6aa3020d1df9755d90b43be WatchSource:0}: Error finding container 31c1e99b5a62210ae0309889bb8b04e9f88fe804a6aa3020d1df9755d90b43be: Status 404 returned error can't find the container with id 31c1e99b5a62210ae0309889bb8b04e9f88fe804a6aa3020d1df9755d90b43be Dec 11 05:35:41 crc kubenswrapper[4628]: I1211 05:35:41.902125 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30bca213-0fd3-4ac3-b075-f3deb4b54cfd" path="/var/lib/kubelet/pods/30bca213-0fd3-4ac3-b075-f3deb4b54cfd/volumes" Dec 11 05:35:42 crc kubenswrapper[4628]: I1211 05:35:42.372694 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"997489dd-97c9-4359-9920-8bdb512f708b","Type":"ContainerStarted","Data":"b7a5da6cd7524035f817b1ce2f3021e205ad453aa57e6fd7f019772c0cdc053f"} Dec 11 05:35:42 crc kubenswrapper[4628]: I1211 05:35:42.372770 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"997489dd-97c9-4359-9920-8bdb512f708b","Type":"ContainerStarted","Data":"31c1e99b5a62210ae0309889bb8b04e9f88fe804a6aa3020d1df9755d90b43be"} Dec 11 05:35:42 crc kubenswrapper[4628]: I1211 05:35:42.411325 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.411272551 podStartE2EDuration="2.411272551s" podCreationTimestamp="2025-12-11 05:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:35:42.40423738 +0000 UTC m=+1244.821584098" watchObservedRunningTime="2025-12-11 05:35:42.411272551 +0000 UTC m=+1244.828619279" Dec 11 05:35:44 crc kubenswrapper[4628]: I1211 05:35:44.104653 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 11 05:35:44 crc kubenswrapper[4628]: I1211 05:35:44.105565 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 11 05:35:46 crc kubenswrapper[4628]: I1211 05:35:46.070445 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 11 05:35:49 crc kubenswrapper[4628]: I1211 05:35:49.104731 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 11 05:35:49 crc kubenswrapper[4628]: I1211 05:35:49.105133 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 11 05:35:49 crc kubenswrapper[4628]: I1211 05:35:49.115961 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 11 05:35:49 crc kubenswrapper[4628]: I1211 05:35:49.116030 4628 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 11 05:35:50 crc kubenswrapper[4628]: I1211 05:35:50.119000 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a14c2329-c910-45b3-a28c-258f07a31c5f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 05:35:50 crc kubenswrapper[4628]: I1211 05:35:50.119029 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a14c2329-c910-45b3-a28c-258f07a31c5f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 05:35:50 crc kubenswrapper[4628]: I1211 05:35:50.132970 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dfa4c6b3-3b46-4383-8813-38a038d8e0da" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 05:35:50 crc kubenswrapper[4628]: I1211 05:35:50.133166 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dfa4c6b3-3b46-4383-8813-38a038d8e0da" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 05:35:51 crc kubenswrapper[4628]: I1211 05:35:51.070947 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 11 05:35:51 crc kubenswrapper[4628]: I1211 05:35:51.118634 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 11 05:35:51 crc kubenswrapper[4628]: I1211 05:35:51.500271 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 11 05:35:56 crc kubenswrapper[4628]: I1211 05:35:56.821539 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 11 05:35:59 crc kubenswrapper[4628]: I1211 05:35:59.111547 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 11 05:35:59 crc kubenswrapper[4628]: I1211 05:35:59.116375 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 11 05:35:59 crc kubenswrapper[4628]: I1211 05:35:59.118046 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 11 05:35:59 crc kubenswrapper[4628]: I1211 05:35:59.134824 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 11 05:35:59 crc kubenswrapper[4628]: I1211 05:35:59.136207 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 11 05:35:59 crc kubenswrapper[4628]: I1211 05:35:59.206696 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 11 05:35:59 crc kubenswrapper[4628]: I1211 05:35:59.487245 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 11 05:35:59 crc kubenswrapper[4628]: I1211 05:35:59.537915 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Dec 11 05:35:59 crc kubenswrapper[4628]: I1211 05:35:59.542882 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 11 05:35:59 crc kubenswrapper[4628]: I1211 05:35:59.544190 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 11 05:36:07 crc kubenswrapper[4628]: I1211 05:36:07.370709 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 11 05:36:08 crc kubenswrapper[4628]: I1211 05:36:08.833140 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 11 05:36:12 crc kubenswrapper[4628]: I1211 05:36:12.318856 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="5279d32c-7625-460c-881b-243e69077070" containerName="rabbitmq" containerID="cri-o://866489e3cb37534a9f3a2f24ee29b00df3ac27c7ba0aba4f75e188125089cd72" gracePeriod=604796 Dec 11 05:36:13 crc kubenswrapper[4628]: I1211 05:36:13.322202 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a07218df-1f25-47c4-89dc-2c7ce7f406ac" containerName="rabbitmq" containerID="cri-o://18ac8dd9c99eb617ae753c989348e0de63c5c48434de8d8b5cdd717035157858" gracePeriod=604796 Dec 11 05:36:18 crc kubenswrapper[4628]: I1211 05:36:18.312650 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="5279d32c-7625-460c-881b-243e69077070" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Dec 11 05:36:18 crc kubenswrapper[4628]: I1211 05:36:18.714729 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="a07218df-1f25-47c4-89dc-2c7ce7f406ac" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Dec 11 05:36:18 crc kubenswrapper[4628]: I1211 05:36:18.738122 4628 generic.go:334] "Generic (PLEG): container finished" podID="5279d32c-7625-460c-881b-243e69077070" containerID="866489e3cb37534a9f3a2f24ee29b00df3ac27c7ba0aba4f75e188125089cd72" exitCode=0 Dec 11 05:36:18 crc kubenswrapper[4628]: I1211 05:36:18.738165 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5279d32c-7625-460c-881b-243e69077070","Type":"ContainerDied","Data":"866489e3cb37534a9f3a2f24ee29b00df3ac27c7ba0aba4f75e188125089cd72"} Dec 11 05:36:18 crc kubenswrapper[4628]: I1211 05:36:18.870018 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.034318 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5279d32c-7625-460c-881b-243e69077070-erlang-cookie-secret\") pod \"5279d32c-7625-460c-881b-243e69077070\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.034402 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5279d32c-7625-460c-881b-243e69077070-rabbitmq-plugins\") pod \"5279d32c-7625-460c-881b-243e69077070\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.034428 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5279d32c-7625-460c-881b-243e69077070-plugins-conf\") pod \"5279d32c-7625-460c-881b-243e69077070\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.034463 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5279d32c-7625-460c-881b-243e69077070-pod-info\") pod \"5279d32c-7625-460c-881b-243e69077070\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.034483 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8mrt\" (UniqueName: \"kubernetes.io/projected/5279d32c-7625-460c-881b-243e69077070-kube-api-access-p8mrt\") pod \"5279d32c-7625-460c-881b-243e69077070\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.034518 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5279d32c-7625-460c-881b-243e69077070-server-conf\") pod \"5279d32c-7625-460c-881b-243e69077070\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.034540 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5279d32c-7625-460c-881b-243e69077070-rabbitmq-confd\") pod \"5279d32c-7625-460c-881b-243e69077070\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.034630 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5279d32c-7625-460c-881b-243e69077070-rabbitmq-erlang-cookie\") pod \"5279d32c-7625-460c-881b-243e69077070\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.034693 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"5279d32c-7625-460c-881b-243e69077070\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.034742 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5279d32c-7625-460c-881b-243e69077070-config-data\") pod \"5279d32c-7625-460c-881b-243e69077070\" (UID: 
\"5279d32c-7625-460c-881b-243e69077070\") " Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.034797 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5279d32c-7625-460c-881b-243e69077070-rabbitmq-tls\") pod \"5279d32c-7625-460c-881b-243e69077070\" (UID: \"5279d32c-7625-460c-881b-243e69077070\") " Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.035935 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5279d32c-7625-460c-881b-243e69077070-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5279d32c-7625-460c-881b-243e69077070" (UID: "5279d32c-7625-460c-881b-243e69077070"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.036563 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5279d32c-7625-460c-881b-243e69077070-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5279d32c-7625-460c-881b-243e69077070" (UID: "5279d32c-7625-460c-881b-243e69077070"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.037082 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5279d32c-7625-460c-881b-243e69077070-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5279d32c-7625-460c-881b-243e69077070" (UID: "5279d32c-7625-460c-881b-243e69077070"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.043739 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5279d32c-7625-460c-881b-243e69077070-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5279d32c-7625-460c-881b-243e69077070" (UID: "5279d32c-7625-460c-881b-243e69077070"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.044785 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5279d32c-7625-460c-881b-243e69077070-kube-api-access-p8mrt" (OuterVolumeSpecName: "kube-api-access-p8mrt") pod "5279d32c-7625-460c-881b-243e69077070" (UID: "5279d32c-7625-460c-881b-243e69077070"). InnerVolumeSpecName "kube-api-access-p8mrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.045348 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "5279d32c-7625-460c-881b-243e69077070" (UID: "5279d32c-7625-460c-881b-243e69077070"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.045985 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5279d32c-7625-460c-881b-243e69077070-pod-info" (OuterVolumeSpecName: "pod-info") pod "5279d32c-7625-460c-881b-243e69077070" (UID: "5279d32c-7625-460c-881b-243e69077070"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.047379 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5279d32c-7625-460c-881b-243e69077070-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5279d32c-7625-460c-881b-243e69077070" (UID: "5279d32c-7625-460c-881b-243e69077070"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.129499 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5279d32c-7625-460c-881b-243e69077070-config-data" (OuterVolumeSpecName: "config-data") pod "5279d32c-7625-460c-881b-243e69077070" (UID: "5279d32c-7625-460c-881b-243e69077070"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.138185 4628 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5279d32c-7625-460c-881b-243e69077070-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.138221 4628 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5279d32c-7625-460c-881b-243e69077070-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.138272 4628 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5279d32c-7625-460c-881b-243e69077070-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.138280 4628 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5279d32c-7625-460c-881b-243e69077070-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.138290 4628 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5279d32c-7625-460c-881b-243e69077070-pod-info\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.138299 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8mrt\" (UniqueName: \"kubernetes.io/projected/5279d32c-7625-460c-881b-243e69077070-kube-api-access-p8mrt\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.138308 4628 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5279d32c-7625-460c-881b-243e69077070-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.138331 4628 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.138340 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5279d32c-7625-460c-881b-243e69077070-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.164314 4628 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.166383 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5279d32c-7625-460c-881b-243e69077070-server-conf" (OuterVolumeSpecName: "server-conf") pod "5279d32c-7625-460c-881b-243e69077070" (UID: "5279d32c-7625-460c-881b-243e69077070"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.212523 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5279d32c-7625-460c-881b-243e69077070-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5279d32c-7625-460c-881b-243e69077070" (UID: "5279d32c-7625-460c-881b-243e69077070"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.239700 4628 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5279d32c-7625-460c-881b-243e69077070-server-conf\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.239730 4628 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5279d32c-7625-460c-881b-243e69077070-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.239747 4628 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.764218 4628 generic.go:334] "Generic (PLEG): container finished" podID="a07218df-1f25-47c4-89dc-2c7ce7f406ac" containerID="18ac8dd9c99eb617ae753c989348e0de63c5c48434de8d8b5cdd717035157858" exitCode=0 Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.764599 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a07218df-1f25-47c4-89dc-2c7ce7f406ac","Type":"ContainerDied","Data":"18ac8dd9c99eb617ae753c989348e0de63c5c48434de8d8b5cdd717035157858"} Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.766099 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5279d32c-7625-460c-881b-243e69077070","Type":"ContainerDied","Data":"79ba71801b18823c7e6b1221d3ed727d096019e948b7b88cbdc863082e6dbfe9"} Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.766126 4628 scope.go:117] "RemoveContainer" containerID="866489e3cb37534a9f3a2f24ee29b00df3ac27c7ba0aba4f75e188125089cd72" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.766328 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.820173 4628 scope.go:117] "RemoveContainer" containerID="86cdb42df246a58a1bcb275c5570adfa9b9a943b1d21a98085ada9bb6063ed40" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.832959 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.864907 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-6vmvj"] Dec 11 05:36:19 crc kubenswrapper[4628]: E1211 05:36:19.865391 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5279d32c-7625-460c-881b-243e69077070" containerName="rabbitmq" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.865407 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="5279d32c-7625-460c-881b-243e69077070" containerName="rabbitmq" Dec 11 05:36:19 crc kubenswrapper[4628]: E1211 05:36:19.865426 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5279d32c-7625-460c-881b-243e69077070" containerName="setup-container" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.865432 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="5279d32c-7625-460c-881b-243e69077070" containerName="setup-container" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.865617 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="5279d32c-7625-460c-881b-243e69077070" containerName="rabbitmq" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.866591 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.872625 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.886920 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.913674 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5279d32c-7625-460c-881b-243e69077070" path="/var/lib/kubelet/pods/5279d32c-7625-460c-881b-243e69077070/volumes" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.920527 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-6vmvj"] Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.939949 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.941572 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.947306 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.947508 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.947667 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.947668 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.947791 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.947791 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lcmx8" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.947741 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.972386 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 11 05:36:19 crc kubenswrapper[4628]: I1211 05:36:19.986953 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.071218 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c89e316-b7b8-4740-aa49-0c21052a51de-config-data\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.071311 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.071345 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.071399 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3c89e316-b7b8-4740-aa49-0c21052a51de-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.071452 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: 
I1211 05:36:20.071483 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8k9p\" (UniqueName: \"kubernetes.io/projected/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-kube-api-access-f8k9p\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.071514 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.071610 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.071702 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.071724 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3c89e316-b7b8-4740-aa49-0c21052a51de-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.071758 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7p75\" (UniqueName: \"kubernetes.io/projected/3c89e316-b7b8-4740-aa49-0c21052a51de-kube-api-access-c7p75\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.071776 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3c89e316-b7b8-4740-aa49-0c21052a51de-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.071801 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3c89e316-b7b8-4740-aa49-0c21052a51de-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.071815 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-config\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.071857 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3c89e316-b7b8-4740-aa49-0c21052a51de-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.071877 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3c89e316-b7b8-4740-aa49-0c21052a51de-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.071899 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3c89e316-b7b8-4740-aa49-0c21052a51de-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.071936 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3c89e316-b7b8-4740-aa49-0c21052a51de-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.174012 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-erlang-cookie\") pod \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.174427 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a07218df-1f25-47c4-89dc-2c7ce7f406ac-server-conf\") pod \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.174454 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a07218df-1f25-47c4-89dc-2c7ce7f406ac-config-data\") pod \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.174481 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvz72\" (UniqueName: \"kubernetes.io/projected/a07218df-1f25-47c4-89dc-2c7ce7f406ac-kube-api-access-zvz72\") pod \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.174541 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a07218df-1f25-47c4-89dc-2c7ce7f406ac-pod-info\") pod \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.174582 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-plugins\") pod \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.174656 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-tls\") pod \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.174682 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-confd\") pod \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.174741 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a07218df-1f25-47c4-89dc-2c7ce7f406ac-erlang-cookie-secret\") pod \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.174774 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.174834 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a07218df-1f25-47c4-89dc-2c7ce7f406ac-plugins-conf\") pod \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\" (UID: \"a07218df-1f25-47c4-89dc-2c7ce7f406ac\") " Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.175135 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.175210 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.175235 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3c89e316-b7b8-4740-aa49-0c21052a51de-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.175269 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7p75\" (UniqueName: \"kubernetes.io/projected/3c89e316-b7b8-4740-aa49-0c21052a51de-kube-api-access-c7p75\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.175290 4628 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3c89e316-b7b8-4740-aa49-0c21052a51de-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.175315 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3c89e316-b7b8-4740-aa49-0c21052a51de-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.175339 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-config\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.175366 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3c89e316-b7b8-4740-aa49-0c21052a51de-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.175387 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3c89e316-b7b8-4740-aa49-0c21052a51de-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.175410 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3c89e316-b7b8-4740-aa49-0c21052a51de-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.175447 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3c89e316-b7b8-4740-aa49-0c21052a51de-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.175497 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c89e316-b7b8-4740-aa49-0c21052a51de-config-data\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.175533 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.175566 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: 
\"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.175611 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3c89e316-b7b8-4740-aa49-0c21052a51de-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.175658 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.175692 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8k9p\" (UniqueName: \"kubernetes.io/projected/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-kube-api-access-f8k9p\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.175724 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.176075 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.176327 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a07218df-1f25-47c4-89dc-2c7ce7f406ac" (UID: "a07218df-1f25-47c4-89dc-2c7ce7f406ac"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.176558 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.177350 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.177374 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a07218df-1f25-47c4-89dc-2c7ce7f406ac-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a07218df-1f25-47c4-89dc-2c7ce7f406ac" (UID: "a07218df-1f25-47c4-89dc-2c7ce7f406ac"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.177679 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a07218df-1f25-47c4-89dc-2c7ce7f406ac" (UID: "a07218df-1f25-47c4-89dc-2c7ce7f406ac"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.178313 4628 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.184168 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3c89e316-b7b8-4740-aa49-0c21052a51de-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.186875 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.187579 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3c89e316-b7b8-4740-aa49-0c21052a51de-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.187815 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-config\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.188127 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3c89e316-b7b8-4740-aa49-0c21052a51de-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.188369 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3c89e316-b7b8-4740-aa49-0c21052a51de-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.188384 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.188556 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c89e316-b7b8-4740-aa49-0c21052a51de-config-data\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.188908 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a07218df-1f25-47c4-89dc-2c7ce7f406ac-pod-info" (OuterVolumeSpecName: "pod-info") pod "a07218df-1f25-47c4-89dc-2c7ce7f406ac" (UID: "a07218df-1f25-47c4-89dc-2c7ce7f406ac"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.199997 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3c89e316-b7b8-4740-aa49-0c21052a51de-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.205458 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "a07218df-1f25-47c4-89dc-2c7ce7f406ac" (UID: "a07218df-1f25-47c4-89dc-2c7ce7f406ac"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.206048 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3c89e316-b7b8-4740-aa49-0c21052a51de-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.206135 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7p75\" (UniqueName: \"kubernetes.io/projected/3c89e316-b7b8-4740-aa49-0c21052a51de-kube-api-access-c7p75\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.206244 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a07218df-1f25-47c4-89dc-2c7ce7f406ac-kube-api-access-zvz72" (OuterVolumeSpecName: "kube-api-access-zvz72") pod "a07218df-1f25-47c4-89dc-2c7ce7f406ac" (UID: "a07218df-1f25-47c4-89dc-2c7ce7f406ac"). InnerVolumeSpecName "kube-api-access-zvz72". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.208355 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8k9p\" (UniqueName: \"kubernetes.io/projected/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-kube-api-access-f8k9p\") pod \"dnsmasq-dns-79bd4cc8c9-6vmvj\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.210018 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a07218df-1f25-47c4-89dc-2c7ce7f406ac-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a07218df-1f25-47c4-89dc-2c7ce7f406ac" (UID: "a07218df-1f25-47c4-89dc-2c7ce7f406ac"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.210323 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3c89e316-b7b8-4740-aa49-0c21052a51de-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.210496 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a07218df-1f25-47c4-89dc-2c7ce7f406ac" (UID: "a07218df-1f25-47c4-89dc-2c7ce7f406ac"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.212218 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3c89e316-b7b8-4740-aa49-0c21052a51de-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.254574 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a07218df-1f25-47c4-89dc-2c7ce7f406ac-config-data" (OuterVolumeSpecName: "config-data") pod "a07218df-1f25-47c4-89dc-2c7ce7f406ac" (UID: "a07218df-1f25-47c4-89dc-2c7ce7f406ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.255148 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a07218df-1f25-47c4-89dc-2c7ce7f406ac-server-conf" (OuterVolumeSpecName: "server-conf") pod "a07218df-1f25-47c4-89dc-2c7ce7f406ac" (UID: "a07218df-1f25-47c4-89dc-2c7ce7f406ac"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.260457 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"3c89e316-b7b8-4740-aa49-0c21052a51de\") " pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.277464 4628 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a07218df-1f25-47c4-89dc-2c7ce7f406ac-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.277495 4628 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.277507 4628 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a07218df-1f25-47c4-89dc-2c7ce7f406ac-server-conf\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.277515 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a07218df-1f25-47c4-89dc-2c7ce7f406ac-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.277523 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvz72\" (UniqueName: \"kubernetes.io/projected/a07218df-1f25-47c4-89dc-2c7ce7f406ac-kube-api-access-zvz72\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.277531 4628 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a07218df-1f25-47c4-89dc-2c7ce7f406ac-pod-info\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.277541 4628 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.277551 
4628 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.277559 4628 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a07218df-1f25-47c4-89dc-2c7ce7f406ac-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.277591 4628 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.288137 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.298424 4628 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.306236 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.379629 4628 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.424190 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a07218df-1f25-47c4-89dc-2c7ce7f406ac" (UID: "a07218df-1f25-47c4-89dc-2c7ce7f406ac"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.481179 4628 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a07218df-1f25-47c4-89dc-2c7ce7f406ac-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.569983 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-6vmvj"] Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.677328 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.780155 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" event={"ID":"86b06a41-e8a1-4815-b7f1-bcf104c0eebb","Type":"ContainerStarted","Data":"be8463023f03eb99f60943f135da3cf6f7b5fbe51d3bf6e9d2fe57c88979277c"} Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.786215 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3c89e316-b7b8-4740-aa49-0c21052a51de","Type":"ContainerStarted","Data":"76d4250e0ba7e414559e5a2a069c641db0c4c4641ff6dff018e5bfe63fb0177c"} Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.788922 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a07218df-1f25-47c4-89dc-2c7ce7f406ac","Type":"ContainerDied","Data":"1f4cdf3765a685d6f721381dcccdc3d0d1cac66291bf30cbf2286158eb9fb311"} Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.788991 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.789183 4628 scope.go:117] "RemoveContainer" containerID="18ac8dd9c99eb617ae753c989348e0de63c5c48434de8d8b5cdd717035157858" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.812776 4628 scope.go:117] "RemoveContainer" containerID="fbb2f6ff2b4cf940c6af7bddcc8de8efd9bce8d4d8220bccb23d4c9966e7b818" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.839908 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.848376 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.888692 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 11 05:36:20 crc kubenswrapper[4628]: E1211 05:36:20.889264 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a07218df-1f25-47c4-89dc-2c7ce7f406ac" containerName="rabbitmq" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.889282 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="a07218df-1f25-47c4-89dc-2c7ce7f406ac" containerName="rabbitmq" Dec 11 05:36:20 crc kubenswrapper[4628]: E1211 05:36:20.889295 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a07218df-1f25-47c4-89dc-2c7ce7f406ac" containerName="setup-container" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.889304 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="a07218df-1f25-47c4-89dc-2c7ce7f406ac" containerName="setup-container" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.889512 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="a07218df-1f25-47c4-89dc-2c7ce7f406ac" 
containerName="rabbitmq" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.890453 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.894343 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.894506 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.894765 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.900095 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.900277 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.900648 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.900926 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hkk9m" Dec 11 05:36:20 crc kubenswrapper[4628]: I1211 05:36:20.908673 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.003043 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtszr\" (UniqueName: \"kubernetes.io/projected/38ba9ced-55a9-40ad-8581-45f8d87da5ef-kube-api-access-jtszr\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.003415 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38ba9ced-55a9-40ad-8581-45f8d87da5ef-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.003574 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38ba9ced-55a9-40ad-8581-45f8d87da5ef-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.004668 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38ba9ced-55a9-40ad-8581-45f8d87da5ef-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.004794 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38ba9ced-55a9-40ad-8581-45f8d87da5ef-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.005517 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.005802 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38ba9ced-55a9-40ad-8581-45f8d87da5ef-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.006034 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38ba9ced-55a9-40ad-8581-45f8d87da5ef-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.006184 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38ba9ced-55a9-40ad-8581-45f8d87da5ef-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.006321 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38ba9ced-55a9-40ad-8581-45f8d87da5ef-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.006422 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38ba9ced-55a9-40ad-8581-45f8d87da5ef-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.108068 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38ba9ced-55a9-40ad-8581-45f8d87da5ef-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.109405 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38ba9ced-55a9-40ad-8581-45f8d87da5ef-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.109344 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38ba9ced-55a9-40ad-8581-45f8d87da5ef-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc 
kubenswrapper[4628]: I1211 05:36:21.110023 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38ba9ced-55a9-40ad-8581-45f8d87da5ef-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.110208 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.110535 4628 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.110873 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38ba9ced-55a9-40ad-8581-45f8d87da5ef-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.111312 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38ba9ced-55a9-40ad-8581-45f8d87da5ef-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.111404 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38ba9ced-55a9-40ad-8581-45f8d87da5ef-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.111489 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38ba9ced-55a9-40ad-8581-45f8d87da5ef-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.111103 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38ba9ced-55a9-40ad-8581-45f8d87da5ef-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.111614 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38ba9ced-55a9-40ad-8581-45f8d87da5ef-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.111732 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtszr\" (UniqueName: 
\"kubernetes.io/projected/38ba9ced-55a9-40ad-8581-45f8d87da5ef-kube-api-access-jtszr\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.111823 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38ba9ced-55a9-40ad-8581-45f8d87da5ef-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.111915 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38ba9ced-55a9-40ad-8581-45f8d87da5ef-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.112261 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38ba9ced-55a9-40ad-8581-45f8d87da5ef-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.128419 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38ba9ced-55a9-40ad-8581-45f8d87da5ef-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.129186 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38ba9ced-55a9-40ad-8581-45f8d87da5ef-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.129320 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38ba9ced-55a9-40ad-8581-45f8d87da5ef-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.129411 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38ba9ced-55a9-40ad-8581-45f8d87da5ef-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.138335 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38ba9ced-55a9-40ad-8581-45f8d87da5ef-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.149710 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtszr\" (UniqueName: \"kubernetes.io/projected/38ba9ced-55a9-40ad-8581-45f8d87da5ef-kube-api-access-jtszr\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.163911 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38ba9ced-55a9-40ad-8581-45f8d87da5ef\") " pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.208445 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.692495 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 11 05:36:21 crc kubenswrapper[4628]: W1211 05:36:21.693898 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38ba9ced_55a9_40ad_8581_45f8d87da5ef.slice/crio-c309e9711d9e610c8190b68994e564a7a1d35697a1edf7f9cfe5198a09249481 WatchSource:0}: Error finding container c309e9711d9e610c8190b68994e564a7a1d35697a1edf7f9cfe5198a09249481: Status 404 returned error can't find the container with id c309e9711d9e610c8190b68994e564a7a1d35697a1edf7f9cfe5198a09249481 Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.807016 4628 generic.go:334] "Generic (PLEG): container finished" podID="86b06a41-e8a1-4815-b7f1-bcf104c0eebb" containerID="4ae1f682f48afb991a9e4aa8e5333146ab6975c7a206190f678e785814ac16aa" exitCode=0 Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.807156 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" event={"ID":"86b06a41-e8a1-4815-b7f1-bcf104c0eebb","Type":"ContainerDied","Data":"4ae1f682f48afb991a9e4aa8e5333146ab6975c7a206190f678e785814ac16aa"} Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.811175 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38ba9ced-55a9-40ad-8581-45f8d87da5ef","Type":"ContainerStarted","Data":"c309e9711d9e610c8190b68994e564a7a1d35697a1edf7f9cfe5198a09249481"} Dec 11 05:36:21 crc kubenswrapper[4628]: I1211 05:36:21.901904 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a07218df-1f25-47c4-89dc-2c7ce7f406ac" path="/var/lib/kubelet/pods/a07218df-1f25-47c4-89dc-2c7ce7f406ac/volumes" Dec 11 05:36:22 crc kubenswrapper[4628]: I1211 05:36:22.839542 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" event={"ID":"86b06a41-e8a1-4815-b7f1-bcf104c0eebb","Type":"ContainerStarted","Data":"88f0b13e663b8eb72dc02f2ef3ede6982d279da0fc45c0296f235ccdfaf8b4ee"} Dec 11 05:36:22 crc kubenswrapper[4628]: I1211 05:36:22.839823 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:22 crc kubenswrapper[4628]: I1211 05:36:22.842242 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3c89e316-b7b8-4740-aa49-0c21052a51de","Type":"ContainerStarted","Data":"76b869f80704a89d024a851b6b4fe574d7cd38bd87dd65c5ea584defef35d9e2"} Dec 11 05:36:22 crc kubenswrapper[4628]: I1211 05:36:22.873470 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" podStartSLOduration=3.873452935 podStartE2EDuration="3.873452935s" podCreationTimestamp="2025-12-11 05:36:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:36:22.861031657 +0000 UTC m=+1285.278378375" watchObservedRunningTime="2025-12-11 05:36:22.873452935 +0000 UTC m=+1285.290799643" Dec 11 05:36:23 crc kubenswrapper[4628]: I1211 05:36:23.855367 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38ba9ced-55a9-40ad-8581-45f8d87da5ef","Type":"ContainerStarted","Data":"47e1b7442246c76bfd86bdd74547a71a358d5a86be6246eaec4a385d4362551c"} Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.290134 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.399301 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-pwlzk"] Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.399830 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" podUID="2f5158e3-ab0a-4ceb-af73-55994e618c50" containerName="dnsmasq-dns" containerID="cri-o://4a7601518f7101a3b07c2b73450be40f1441020c90a96dd3fc1ec73d6977692c" gracePeriod=10 Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.590572 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54ffdb7d8c-gz9jh"] Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.594217 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.621584 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54ffdb7d8c-gz9jh"] Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.714943 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1842054b-c613-4c76-9cb8-3738bc44a946-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.715015 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1842054b-c613-4c76-9cb8-3738bc44a946-dns-swift-storage-0\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.715045 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1842054b-c613-4c76-9cb8-3738bc44a946-openstack-edpm-ipam\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.715094 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqtxb\" (UniqueName: \"kubernetes.io/projected/1842054b-c613-4c76-9cb8-3738bc44a946-kube-api-access-kqtxb\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.715119 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1842054b-c613-4c76-9cb8-3738bc44a946-config\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.715155 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1842054b-c613-4c76-9cb8-3738bc44a946-dns-svc\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.715179 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1842054b-c613-4c76-9cb8-3738bc44a946-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.819342 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1842054b-c613-4c76-9cb8-3738bc44a946-dns-svc\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.819413 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1842054b-c613-4c76-9cb8-3738bc44a946-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.819498 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1842054b-c613-4c76-9cb8-3738bc44a946-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.819554 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1842054b-c613-4c76-9cb8-3738bc44a946-dns-swift-storage-0\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.819617 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1842054b-c613-4c76-9cb8-3738bc44a946-openstack-edpm-ipam\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.819704 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqtxb\" (UniqueName: \"kubernetes.io/projected/1842054b-c613-4c76-9cb8-3738bc44a946-kube-api-access-kqtxb\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.819737 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1842054b-c613-4c76-9cb8-3738bc44a946-config\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.820140 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1842054b-c613-4c76-9cb8-3738bc44a946-dns-svc\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.820807 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1842054b-c613-4c76-9cb8-3738bc44a946-dns-swift-storage-0\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.820811 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1842054b-c613-4c76-9cb8-3738bc44a946-config\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.821397 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1842054b-c613-4c76-9cb8-3738bc44a946-ovsdbserver-sb\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.821530 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1842054b-c613-4c76-9cb8-3738bc44a946-openstack-edpm-ipam\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.821912 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1842054b-c613-4c76-9cb8-3738bc44a946-ovsdbserver-nb\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.840353 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqtxb\" (UniqueName: \"kubernetes.io/projected/1842054b-c613-4c76-9cb8-3738bc44a946-kube-api-access-kqtxb\") pod \"dnsmasq-dns-54ffdb7d8c-gz9jh\" (UID: \"1842054b-c613-4c76-9cb8-3738bc44a946\") " pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.913686 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.927508 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.950092 4628 generic.go:334] "Generic (PLEG): container finished" podID="2f5158e3-ab0a-4ceb-af73-55994e618c50" containerID="4a7601518f7101a3b07c2b73450be40f1441020c90a96dd3fc1ec73d6977692c" exitCode=0 Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.950127 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" event={"ID":"2f5158e3-ab0a-4ceb-af73-55994e618c50","Type":"ContainerDied","Data":"4a7601518f7101a3b07c2b73450be40f1441020c90a96dd3fc1ec73d6977692c"} Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.950178 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" event={"ID":"2f5158e3-ab0a-4ceb-af73-55994e618c50","Type":"ContainerDied","Data":"88a1fa0f21cd76a818b512ba0fe7fa8053324a0066b1b6bffeece46b1af38660"} Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.950202 4628 scope.go:117] "RemoveContainer" containerID="4a7601518f7101a3b07c2b73450be40f1441020c90a96dd3fc1ec73d6977692c" Dec 11 05:36:30 crc kubenswrapper[4628]: I1211 05:36:30.950379 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-pwlzk" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.025598 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wslmq\" (UniqueName: \"kubernetes.io/projected/2f5158e3-ab0a-4ceb-af73-55994e618c50-kube-api-access-wslmq\") pod \"2f5158e3-ab0a-4ceb-af73-55994e618c50\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.025681 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-dns-swift-storage-0\") pod \"2f5158e3-ab0a-4ceb-af73-55994e618c50\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.025746 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-ovsdbserver-sb\") pod \"2f5158e3-ab0a-4ceb-af73-55994e618c50\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.025809 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-dns-svc\") pod \"2f5158e3-ab0a-4ceb-af73-55994e618c50\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.025836 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-config\") pod \"2f5158e3-ab0a-4ceb-af73-55994e618c50\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.026054 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-ovsdbserver-nb\") pod \"2f5158e3-ab0a-4ceb-af73-55994e618c50\" (UID: \"2f5158e3-ab0a-4ceb-af73-55994e618c50\") " Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.045756 4628 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/2f5158e3-ab0a-4ceb-af73-55994e618c50-kube-api-access-wslmq" (OuterVolumeSpecName: "kube-api-access-wslmq") pod "2f5158e3-ab0a-4ceb-af73-55994e618c50" (UID: "2f5158e3-ab0a-4ceb-af73-55994e618c50"). InnerVolumeSpecName "kube-api-access-wslmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.079227 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2f5158e3-ab0a-4ceb-af73-55994e618c50" (UID: "2f5158e3-ab0a-4ceb-af73-55994e618c50"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.098233 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2f5158e3-ab0a-4ceb-af73-55994e618c50" (UID: "2f5158e3-ab0a-4ceb-af73-55994e618c50"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.102509 4628 scope.go:117] "RemoveContainer" containerID="7f6644e0b0532911a00388ffc4629a0331ab96cff590ae21f17f553d471c533e" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.111212 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-config" (OuterVolumeSpecName: "config") pod "2f5158e3-ab0a-4ceb-af73-55994e618c50" (UID: "2f5158e3-ab0a-4ceb-af73-55994e618c50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.128109 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.128142 4628 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.128151 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.128159 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wslmq\" (UniqueName: \"kubernetes.io/projected/2f5158e3-ab0a-4ceb-af73-55994e618c50-kube-api-access-wslmq\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.152395 4628 scope.go:117] "RemoveContainer" containerID="4a7601518f7101a3b07c2b73450be40f1441020c90a96dd3fc1ec73d6977692c" Dec 11 05:36:31 crc kubenswrapper[4628]: E1211 05:36:31.152905 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a7601518f7101a3b07c2b73450be40f1441020c90a96dd3fc1ec73d6977692c\": container with ID starting with 4a7601518f7101a3b07c2b73450be40f1441020c90a96dd3fc1ec73d6977692c not found: ID does not exist" containerID="4a7601518f7101a3b07c2b73450be40f1441020c90a96dd3fc1ec73d6977692c" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 
05:36:31.152959 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a7601518f7101a3b07c2b73450be40f1441020c90a96dd3fc1ec73d6977692c"} err="failed to get container status \"4a7601518f7101a3b07c2b73450be40f1441020c90a96dd3fc1ec73d6977692c\": rpc error: code = NotFound desc = could not find container \"4a7601518f7101a3b07c2b73450be40f1441020c90a96dd3fc1ec73d6977692c\": container with ID starting with 4a7601518f7101a3b07c2b73450be40f1441020c90a96dd3fc1ec73d6977692c not found: ID does not exist" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.152982 4628 scope.go:117] "RemoveContainer" containerID="7f6644e0b0532911a00388ffc4629a0331ab96cff590ae21f17f553d471c533e" Dec 11 05:36:31 crc kubenswrapper[4628]: E1211 05:36:31.153233 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f6644e0b0532911a00388ffc4629a0331ab96cff590ae21f17f553d471c533e\": container with ID starting with 7f6644e0b0532911a00388ffc4629a0331ab96cff590ae21f17f553d471c533e not found: ID does not exist" containerID="7f6644e0b0532911a00388ffc4629a0331ab96cff590ae21f17f553d471c533e" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.153274 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f6644e0b0532911a00388ffc4629a0331ab96cff590ae21f17f553d471c533e"} err="failed to get container status \"7f6644e0b0532911a00388ffc4629a0331ab96cff590ae21f17f553d471c533e\": rpc error: code = NotFound desc = could not find container \"7f6644e0b0532911a00388ffc4629a0331ab96cff590ae21f17f553d471c533e\": container with ID starting with 7f6644e0b0532911a00388ffc4629a0331ab96cff590ae21f17f553d471c533e not found: ID does not exist" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.157343 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2f5158e3-ab0a-4ceb-af73-55994e618c50" (UID: "2f5158e3-ab0a-4ceb-af73-55994e618c50"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.180247 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2f5158e3-ab0a-4ceb-af73-55994e618c50" (UID: "2f5158e3-ab0a-4ceb-af73-55994e618c50"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.230193 4628 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.230226 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f5158e3-ab0a-4ceb-af73-55994e618c50-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.281562 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-pwlzk"] Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.288641 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-pwlzk"] Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.424777 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54ffdb7d8c-gz9jh"] Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.426942 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.427003 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.901209 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f5158e3-ab0a-4ceb-af73-55994e618c50" path="/var/lib/kubelet/pods/2f5158e3-ab0a-4ceb-af73-55994e618c50/volumes" Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.959682 4628 generic.go:334] "Generic (PLEG): container finished" podID="1842054b-c613-4c76-9cb8-3738bc44a946" containerID="92522cd35d361d07f829675b71be99527915d019ee6bf6bfa93e61441ce1a77f" exitCode=0 Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.959827 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" event={"ID":"1842054b-c613-4c76-9cb8-3738bc44a946","Type":"ContainerDied","Data":"92522cd35d361d07f829675b71be99527915d019ee6bf6bfa93e61441ce1a77f"} Dec 11 05:36:31 crc kubenswrapper[4628]: I1211 05:36:31.959931 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" event={"ID":"1842054b-c613-4c76-9cb8-3738bc44a946","Type":"ContainerStarted","Data":"6932b2d910831ed009fd4842d399aad175da088e0e7e22d7c4c01a2f6284eaff"} Dec 11 05:36:32 crc kubenswrapper[4628]: I1211 05:36:32.973008 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" event={"ID":"1842054b-c613-4c76-9cb8-3738bc44a946","Type":"ContainerStarted","Data":"99b0ccf511787b46148654b2b98a9fc56fd00f30b47bfd7db5c4a391ba85caa4"} Dec 11 05:36:32 crc kubenswrapper[4628]: I1211 05:36:32.973425 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:32 crc kubenswrapper[4628]: I1211 05:36:32.991568 4628 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" podStartSLOduration=2.9915530009999998 podStartE2EDuration="2.991553001s" podCreationTimestamp="2025-12-11 05:36:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:36:32.989236788 +0000 UTC m=+1295.406583486" watchObservedRunningTime="2025-12-11 05:36:32.991553001 +0000 UTC m=+1295.408899699" Dec 11 05:36:40 crc kubenswrapper[4628]: I1211 05:36:40.916169 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54ffdb7d8c-gz9jh" Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.018293 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-6vmvj"] Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.018929 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" podUID="86b06a41-e8a1-4815-b7f1-bcf104c0eebb" containerName="dnsmasq-dns" containerID="cri-o://88f0b13e663b8eb72dc02f2ef3ede6982d279da0fc45c0296f235ccdfaf8b4ee" gracePeriod=10 Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.498069 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.571273 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-config\") pod \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.571345 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-dns-svc\") pod \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.571366 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-ovsdbserver-sb\") pod \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.571414 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-ovsdbserver-nb\") pod \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.571466 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-dns-swift-storage-0\") pod \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.571561 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8k9p\" (UniqueName: \"kubernetes.io/projected/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-kube-api-access-f8k9p\") pod \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " Dec 11 05:36:41 crc 
kubenswrapper[4628]: I1211 05:36:41.571596 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-openstack-edpm-ipam\") pod \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\" (UID: \"86b06a41-e8a1-4815-b7f1-bcf104c0eebb\") " Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.614039 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-kube-api-access-f8k9p" (OuterVolumeSpecName: "kube-api-access-f8k9p") pod "86b06a41-e8a1-4815-b7f1-bcf104c0eebb" (UID: "86b06a41-e8a1-4815-b7f1-bcf104c0eebb"). InnerVolumeSpecName "kube-api-access-f8k9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.673493 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8k9p\" (UniqueName: \"kubernetes.io/projected/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-kube-api-access-f8k9p\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.786612 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "86b06a41-e8a1-4815-b7f1-bcf104c0eebb" (UID: "86b06a41-e8a1-4815-b7f1-bcf104c0eebb"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.808170 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-config" (OuterVolumeSpecName: "config") pod "86b06a41-e8a1-4815-b7f1-bcf104c0eebb" (UID: "86b06a41-e8a1-4815-b7f1-bcf104c0eebb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.820379 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "86b06a41-e8a1-4815-b7f1-bcf104c0eebb" (UID: "86b06a41-e8a1-4815-b7f1-bcf104c0eebb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.828614 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "86b06a41-e8a1-4815-b7f1-bcf104c0eebb" (UID: "86b06a41-e8a1-4815-b7f1-bcf104c0eebb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.834191 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "86b06a41-e8a1-4815-b7f1-bcf104c0eebb" (UID: "86b06a41-e8a1-4815-b7f1-bcf104c0eebb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.834286 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "86b06a41-e8a1-4815-b7f1-bcf104c0eebb" (UID: "86b06a41-e8a1-4815-b7f1-bcf104c0eebb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.888231 4628 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.888265 4628 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-config\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.888274 4628 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.888283 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.888294 4628 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:41 crc kubenswrapper[4628]: I1211 05:36:41.888304 4628 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86b06a41-e8a1-4815-b7f1-bcf104c0eebb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:36:42 crc kubenswrapper[4628]: I1211 05:36:42.065722 4628 generic.go:334] "Generic (PLEG): container finished" podID="86b06a41-e8a1-4815-b7f1-bcf104c0eebb" containerID="88f0b13e663b8eb72dc02f2ef3ede6982d279da0fc45c0296f235ccdfaf8b4ee" exitCode=0 Dec 11 05:36:42 crc kubenswrapper[4628]: I1211 05:36:42.065765 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" event={"ID":"86b06a41-e8a1-4815-b7f1-bcf104c0eebb","Type":"ContainerDied","Data":"88f0b13e663b8eb72dc02f2ef3ede6982d279da0fc45c0296f235ccdfaf8b4ee"} Dec 11 05:36:42 crc kubenswrapper[4628]: I1211 05:36:42.065791 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" event={"ID":"86b06a41-e8a1-4815-b7f1-bcf104c0eebb","Type":"ContainerDied","Data":"be8463023f03eb99f60943f135da3cf6f7b5fbe51d3bf6e9d2fe57c88979277c"} Dec 11 05:36:42 crc kubenswrapper[4628]: I1211 05:36:42.065807 4628 scope.go:117] "RemoveContainer" containerID="88f0b13e663b8eb72dc02f2ef3ede6982d279da0fc45c0296f235ccdfaf8b4ee" Dec 11 05:36:42 crc kubenswrapper[4628]: I1211 05:36:42.065950 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-6vmvj" Dec 11 05:36:42 crc kubenswrapper[4628]: I1211 05:36:42.093005 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-6vmvj"] Dec 11 05:36:42 crc kubenswrapper[4628]: I1211 05:36:42.104334 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-6vmvj"] Dec 11 05:36:42 crc kubenswrapper[4628]: I1211 05:36:42.110832 4628 scope.go:117] "RemoveContainer" containerID="4ae1f682f48afb991a9e4aa8e5333146ab6975c7a206190f678e785814ac16aa" Dec 11 05:36:42 crc kubenswrapper[4628]: I1211 05:36:42.138579 4628 scope.go:117] "RemoveContainer" containerID="88f0b13e663b8eb72dc02f2ef3ede6982d279da0fc45c0296f235ccdfaf8b4ee" Dec 11 05:36:42 crc kubenswrapper[4628]: E1211 05:36:42.138978 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88f0b13e663b8eb72dc02f2ef3ede6982d279da0fc45c0296f235ccdfaf8b4ee\": container with ID starting with 88f0b13e663b8eb72dc02f2ef3ede6982d279da0fc45c0296f235ccdfaf8b4ee not found: ID does not exist" containerID="88f0b13e663b8eb72dc02f2ef3ede6982d279da0fc45c0296f235ccdfaf8b4ee" Dec 11 05:36:42 crc kubenswrapper[4628]: I1211 05:36:42.139025 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f0b13e663b8eb72dc02f2ef3ede6982d279da0fc45c0296f235ccdfaf8b4ee"} err="failed to get container status \"88f0b13e663b8eb72dc02f2ef3ede6982d279da0fc45c0296f235ccdfaf8b4ee\": rpc error: code = NotFound desc = could not find container \"88f0b13e663b8eb72dc02f2ef3ede6982d279da0fc45c0296f235ccdfaf8b4ee\": container with ID starting with 88f0b13e663b8eb72dc02f2ef3ede6982d279da0fc45c0296f235ccdfaf8b4ee not found: ID does not exist" Dec 11 05:36:42 crc kubenswrapper[4628]: I1211 05:36:42.139061 4628 scope.go:117] "RemoveContainer" containerID="4ae1f682f48afb991a9e4aa8e5333146ab6975c7a206190f678e785814ac16aa" Dec 11 05:36:42 crc kubenswrapper[4628]: E1211 05:36:42.139808 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ae1f682f48afb991a9e4aa8e5333146ab6975c7a206190f678e785814ac16aa\": container with ID starting with 4ae1f682f48afb991a9e4aa8e5333146ab6975c7a206190f678e785814ac16aa not found: ID does not exist" containerID="4ae1f682f48afb991a9e4aa8e5333146ab6975c7a206190f678e785814ac16aa" Dec 11 05:36:42 crc kubenswrapper[4628]: I1211 05:36:42.139836 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae1f682f48afb991a9e4aa8e5333146ab6975c7a206190f678e785814ac16aa"} err="failed to get container status \"4ae1f682f48afb991a9e4aa8e5333146ab6975c7a206190f678e785814ac16aa\": rpc error: code = NotFound desc = could not find container \"4ae1f682f48afb991a9e4aa8e5333146ab6975c7a206190f678e785814ac16aa\": container with ID starting with 4ae1f682f48afb991a9e4aa8e5333146ab6975c7a206190f678e785814ac16aa not found: ID does not exist" Dec 11 05:36:43 crc kubenswrapper[4628]: I1211 05:36:43.907637 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86b06a41-e8a1-4815-b7f1-bcf104c0eebb" path="/var/lib/kubelet/pods/86b06a41-e8a1-4815-b7f1-bcf104c0eebb/volumes" Dec 11 05:36:55 crc kubenswrapper[4628]: I1211 05:36:55.197473 4628 generic.go:334] "Generic (PLEG): container finished" podID="3c89e316-b7b8-4740-aa49-0c21052a51de" containerID="76b869f80704a89d024a851b6b4fe574d7cd38bd87dd65c5ea584defef35d9e2" 
exitCode=0 Dec 11 05:36:55 crc kubenswrapper[4628]: I1211 05:36:55.197661 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3c89e316-b7b8-4740-aa49-0c21052a51de","Type":"ContainerDied","Data":"76b869f80704a89d024a851b6b4fe574d7cd38bd87dd65c5ea584defef35d9e2"} Dec 11 05:36:56 crc kubenswrapper[4628]: I1211 05:36:56.208709 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3c89e316-b7b8-4740-aa49-0c21052a51de","Type":"ContainerStarted","Data":"2ccd130a4312a4819592eb416cd479d671aeca1f9b43f9ad2a0025e86dbb12ed"} Dec 11 05:36:56 crc kubenswrapper[4628]: I1211 05:36:56.209299 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 11 05:36:56 crc kubenswrapper[4628]: I1211 05:36:56.211151 4628 generic.go:334] "Generic (PLEG): container finished" podID="38ba9ced-55a9-40ad-8581-45f8d87da5ef" containerID="47e1b7442246c76bfd86bdd74547a71a358d5a86be6246eaec4a385d4362551c" exitCode=0 Dec 11 05:36:56 crc kubenswrapper[4628]: I1211 05:36:56.211184 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38ba9ced-55a9-40ad-8581-45f8d87da5ef","Type":"ContainerDied","Data":"47e1b7442246c76bfd86bdd74547a71a358d5a86be6246eaec4a385d4362551c"} Dec 11 05:36:56 crc kubenswrapper[4628]: I1211 05:36:56.248229 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.24820428 podStartE2EDuration="37.24820428s" podCreationTimestamp="2025-12-11 05:36:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:36:56.238187273 +0000 UTC m=+1318.655534001" watchObservedRunningTime="2025-12-11 05:36:56.24820428 +0000 UTC m=+1318.665551018" Dec 11 05:36:57 crc kubenswrapper[4628]: I1211 05:36:57.220901 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"38ba9ced-55a9-40ad-8581-45f8d87da5ef","Type":"ContainerStarted","Data":"a79fe1f0fb5c9f11a9dea10ea106efd60a5b0957604b5fb5d56e5d3154b9f81d"} Dec 11 05:36:57 crc kubenswrapper[4628]: I1211 05:36:57.221532 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:36:58 crc kubenswrapper[4628]: I1211 05:36:58.919778 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.91975507 podStartE2EDuration="38.91975507s" podCreationTimestamp="2025-12-11 05:36:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:36:57.255767616 +0000 UTC m=+1319.673114314" watchObservedRunningTime="2025-12-11 05:36:58.91975507 +0000 UTC m=+1321.337101768" Dec 11 05:36:58 crc kubenswrapper[4628]: I1211 05:36:58.937415 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55"] Dec 11 05:36:58 crc kubenswrapper[4628]: E1211 05:36:58.938185 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b06a41-e8a1-4815-b7f1-bcf104c0eebb" containerName="init" Dec 11 05:36:58 crc kubenswrapper[4628]: I1211 05:36:58.938203 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b06a41-e8a1-4815-b7f1-bcf104c0eebb" containerName="init" Dec 11 05:36:58 crc kubenswrapper[4628]: 
E1211 05:36:58.938250 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f5158e3-ab0a-4ceb-af73-55994e618c50" containerName="dnsmasq-dns" Dec 11 05:36:58 crc kubenswrapper[4628]: I1211 05:36:58.938259 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f5158e3-ab0a-4ceb-af73-55994e618c50" containerName="dnsmasq-dns" Dec 11 05:36:58 crc kubenswrapper[4628]: E1211 05:36:58.938277 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f5158e3-ab0a-4ceb-af73-55994e618c50" containerName="init" Dec 11 05:36:58 crc kubenswrapper[4628]: I1211 05:36:58.938283 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f5158e3-ab0a-4ceb-af73-55994e618c50" containerName="init" Dec 11 05:36:58 crc kubenswrapper[4628]: E1211 05:36:58.938311 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b06a41-e8a1-4815-b7f1-bcf104c0eebb" containerName="dnsmasq-dns" Dec 11 05:36:58 crc kubenswrapper[4628]: I1211 05:36:58.938319 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b06a41-e8a1-4815-b7f1-bcf104c0eebb" containerName="dnsmasq-dns" Dec 11 05:36:58 crc kubenswrapper[4628]: I1211 05:36:58.938666 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="86b06a41-e8a1-4815-b7f1-bcf104c0eebb" containerName="dnsmasq-dns" Dec 11 05:36:58 crc kubenswrapper[4628]: I1211 05:36:58.938704 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f5158e3-ab0a-4ceb-af73-55994e618c50" containerName="dnsmasq-dns" Dec 11 05:36:58 crc kubenswrapper[4628]: I1211 05:36:58.939516 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" Dec 11 05:36:58 crc kubenswrapper[4628]: I1211 05:36:58.943216 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 05:36:58 crc kubenswrapper[4628]: I1211 05:36:58.943628 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 05:36:58 crc kubenswrapper[4628]: I1211 05:36:58.953710 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55"] Dec 11 05:36:58 crc kubenswrapper[4628]: I1211 05:36:58.991541 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t5hzf" Dec 11 05:36:58 crc kubenswrapper[4628]: I1211 05:36:58.991777 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 05:36:59 crc kubenswrapper[4628]: I1211 05:36:59.085420 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67fc31e9-87aa-48c9-9888-52a10d0858dd-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55\" (UID: \"67fc31e9-87aa-48c9-9888-52a10d0858dd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" Dec 11 05:36:59 crc kubenswrapper[4628]: I1211 05:36:59.085473 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dddg\" (UniqueName: \"kubernetes.io/projected/67fc31e9-87aa-48c9-9888-52a10d0858dd-kube-api-access-2dddg\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55\" (UID: \"67fc31e9-87aa-48c9-9888-52a10d0858dd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" Dec 11 05:36:59 
crc kubenswrapper[4628]: I1211 05:36:59.085499 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67fc31e9-87aa-48c9-9888-52a10d0858dd-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55\" (UID: \"67fc31e9-87aa-48c9-9888-52a10d0858dd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" Dec 11 05:36:59 crc kubenswrapper[4628]: I1211 05:36:59.085582 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67fc31e9-87aa-48c9-9888-52a10d0858dd-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55\" (UID: \"67fc31e9-87aa-48c9-9888-52a10d0858dd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" Dec 11 05:36:59 crc kubenswrapper[4628]: I1211 05:36:59.187605 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67fc31e9-87aa-48c9-9888-52a10d0858dd-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55\" (UID: \"67fc31e9-87aa-48c9-9888-52a10d0858dd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" Dec 11 05:36:59 crc kubenswrapper[4628]: I1211 05:36:59.187700 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dddg\" (UniqueName: \"kubernetes.io/projected/67fc31e9-87aa-48c9-9888-52a10d0858dd-kube-api-access-2dddg\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55\" (UID: \"67fc31e9-87aa-48c9-9888-52a10d0858dd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" Dec 11 05:36:59 crc kubenswrapper[4628]: I1211 05:36:59.187773 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67fc31e9-87aa-48c9-9888-52a10d0858dd-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55\" (UID: \"67fc31e9-87aa-48c9-9888-52a10d0858dd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" Dec 11 05:36:59 crc kubenswrapper[4628]: I1211 05:36:59.187939 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67fc31e9-87aa-48c9-9888-52a10d0858dd-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55\" (UID: \"67fc31e9-87aa-48c9-9888-52a10d0858dd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" Dec 11 05:36:59 crc kubenswrapper[4628]: I1211 05:36:59.205782 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67fc31e9-87aa-48c9-9888-52a10d0858dd-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55\" (UID: \"67fc31e9-87aa-48c9-9888-52a10d0858dd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" Dec 11 05:36:59 crc kubenswrapper[4628]: I1211 05:36:59.205862 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67fc31e9-87aa-48c9-9888-52a10d0858dd-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55\" (UID: \"67fc31e9-87aa-48c9-9888-52a10d0858dd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" Dec 11 05:36:59 crc 
kubenswrapper[4628]: I1211 05:36:59.205953 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67fc31e9-87aa-48c9-9888-52a10d0858dd-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55\" (UID: \"67fc31e9-87aa-48c9-9888-52a10d0858dd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" Dec 11 05:36:59 crc kubenswrapper[4628]: I1211 05:36:59.209208 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dddg\" (UniqueName: \"kubernetes.io/projected/67fc31e9-87aa-48c9-9888-52a10d0858dd-kube-api-access-2dddg\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55\" (UID: \"67fc31e9-87aa-48c9-9888-52a10d0858dd\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" Dec 11 05:36:59 crc kubenswrapper[4628]: I1211 05:36:59.320663 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" Dec 11 05:37:00 crc kubenswrapper[4628]: I1211 05:37:00.014086 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55"] Dec 11 05:37:00 crc kubenswrapper[4628]: I1211 05:37:00.252196 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" event={"ID":"67fc31e9-87aa-48c9-9888-52a10d0858dd","Type":"ContainerStarted","Data":"a5212a6e56bcdb80ffecaac457ed37fd675dc62bfae4a6215f4e50db7733ef54"} Dec 11 05:37:01 crc kubenswrapper[4628]: I1211 05:37:01.426607 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:37:01 crc kubenswrapper[4628]: I1211 05:37:01.427231 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:37:10 crc kubenswrapper[4628]: I1211 05:37:10.310531 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 11 05:37:11 crc kubenswrapper[4628]: I1211 05:37:11.211791 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 11 05:37:11 crc kubenswrapper[4628]: I1211 05:37:11.401942 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" event={"ID":"67fc31e9-87aa-48c9-9888-52a10d0858dd","Type":"ContainerStarted","Data":"34f43b935da544f5ff0cb0b3b65b1f799fd7c747b9763dbd2263a3190b3d8499"} Dec 11 05:37:11 crc kubenswrapper[4628]: I1211 05:37:11.419186 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" podStartSLOduration=3.131815191 podStartE2EDuration="13.419169344s" podCreationTimestamp="2025-12-11 05:36:58 +0000 UTC" firstStartedPulling="2025-12-11 05:37:00.013014169 +0000 UTC m=+1322.430360867" lastFinishedPulling="2025-12-11 05:37:10.300368312 +0000 UTC m=+1332.717715020" observedRunningTime="2025-12-11 05:37:11.412959528 +0000 UTC 
m=+1333.830306246" watchObservedRunningTime="2025-12-11 05:37:11.419169344 +0000 UTC m=+1333.836516042" Dec 11 05:37:23 crc kubenswrapper[4628]: I1211 05:37:23.514738 4628 generic.go:334] "Generic (PLEG): container finished" podID="67fc31e9-87aa-48c9-9888-52a10d0858dd" containerID="34f43b935da544f5ff0cb0b3b65b1f799fd7c747b9763dbd2263a3190b3d8499" exitCode=0 Dec 11 05:37:23 crc kubenswrapper[4628]: I1211 05:37:23.514825 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" event={"ID":"67fc31e9-87aa-48c9-9888-52a10d0858dd","Type":"ContainerDied","Data":"34f43b935da544f5ff0cb0b3b65b1f799fd7c747b9763dbd2263a3190b3d8499"} Dec 11 05:37:24 crc kubenswrapper[4628]: I1211 05:37:24.916997 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.008345 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dddg\" (UniqueName: \"kubernetes.io/projected/67fc31e9-87aa-48c9-9888-52a10d0858dd-kube-api-access-2dddg\") pod \"67fc31e9-87aa-48c9-9888-52a10d0858dd\" (UID: \"67fc31e9-87aa-48c9-9888-52a10d0858dd\") " Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.008573 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67fc31e9-87aa-48c9-9888-52a10d0858dd-ssh-key\") pod \"67fc31e9-87aa-48c9-9888-52a10d0858dd\" (UID: \"67fc31e9-87aa-48c9-9888-52a10d0858dd\") " Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.008609 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67fc31e9-87aa-48c9-9888-52a10d0858dd-inventory\") pod \"67fc31e9-87aa-48c9-9888-52a10d0858dd\" (UID: \"67fc31e9-87aa-48c9-9888-52a10d0858dd\") " Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.008703 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67fc31e9-87aa-48c9-9888-52a10d0858dd-repo-setup-combined-ca-bundle\") pod \"67fc31e9-87aa-48c9-9888-52a10d0858dd\" (UID: \"67fc31e9-87aa-48c9-9888-52a10d0858dd\") " Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.021511 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67fc31e9-87aa-48c9-9888-52a10d0858dd-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "67fc31e9-87aa-48c9-9888-52a10d0858dd" (UID: "67fc31e9-87aa-48c9-9888-52a10d0858dd"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.021728 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67fc31e9-87aa-48c9-9888-52a10d0858dd-kube-api-access-2dddg" (OuterVolumeSpecName: "kube-api-access-2dddg") pod "67fc31e9-87aa-48c9-9888-52a10d0858dd" (UID: "67fc31e9-87aa-48c9-9888-52a10d0858dd"). InnerVolumeSpecName "kube-api-access-2dddg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.036425 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67fc31e9-87aa-48c9-9888-52a10d0858dd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "67fc31e9-87aa-48c9-9888-52a10d0858dd" (UID: "67fc31e9-87aa-48c9-9888-52a10d0858dd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.047097 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67fc31e9-87aa-48c9-9888-52a10d0858dd-inventory" (OuterVolumeSpecName: "inventory") pod "67fc31e9-87aa-48c9-9888-52a10d0858dd" (UID: "67fc31e9-87aa-48c9-9888-52a10d0858dd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.111823 4628 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/67fc31e9-87aa-48c9-9888-52a10d0858dd-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.111886 4628 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67fc31e9-87aa-48c9-9888-52a10d0858dd-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.111900 4628 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67fc31e9-87aa-48c9-9888-52a10d0858dd-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.111916 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dddg\" (UniqueName: \"kubernetes.io/projected/67fc31e9-87aa-48c9-9888-52a10d0858dd-kube-api-access-2dddg\") on node \"crc\" DevicePath \"\"" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.537834 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" event={"ID":"67fc31e9-87aa-48c9-9888-52a10d0858dd","Type":"ContainerDied","Data":"a5212a6e56bcdb80ffecaac457ed37fd675dc62bfae4a6215f4e50db7733ef54"} Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.537885 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5212a6e56bcdb80ffecaac457ed37fd675dc62bfae4a6215f4e50db7733ef54" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.537994 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.646872 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm"] Dec 11 05:37:25 crc kubenswrapper[4628]: E1211 05:37:25.648126 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67fc31e9-87aa-48c9-9888-52a10d0858dd" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.648151 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="67fc31e9-87aa-48c9-9888-52a10d0858dd" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.648652 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="67fc31e9-87aa-48c9-9888-52a10d0858dd" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.649696 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.664167 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.664228 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.665017 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.665261 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t5hzf" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.695804 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm"] Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.723645 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf8e2426-3f6e-4291-b9ea-77b91670d471-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7cjnm\" (UID: \"cf8e2426-3f6e-4291-b9ea-77b91670d471\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.723701 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff8m5\" (UniqueName: \"kubernetes.io/projected/cf8e2426-3f6e-4291-b9ea-77b91670d471-kube-api-access-ff8m5\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7cjnm\" (UID: \"cf8e2426-3f6e-4291-b9ea-77b91670d471\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.723750 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf8e2426-3f6e-4291-b9ea-77b91670d471-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7cjnm\" (UID: \"cf8e2426-3f6e-4291-b9ea-77b91670d471\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.788144 4628 scope.go:117] "RemoveContainer" 
containerID="ecd73a76204a4451e2d66df534ea5eab739cbcf64c836b3f7c403b7b4249d88a" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.825126 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf8e2426-3f6e-4291-b9ea-77b91670d471-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7cjnm\" (UID: \"cf8e2426-3f6e-4291-b9ea-77b91670d471\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.825180 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff8m5\" (UniqueName: \"kubernetes.io/projected/cf8e2426-3f6e-4291-b9ea-77b91670d471-kube-api-access-ff8m5\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7cjnm\" (UID: \"cf8e2426-3f6e-4291-b9ea-77b91670d471\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.825273 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf8e2426-3f6e-4291-b9ea-77b91670d471-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7cjnm\" (UID: \"cf8e2426-3f6e-4291-b9ea-77b91670d471\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.830068 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf8e2426-3f6e-4291-b9ea-77b91670d471-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7cjnm\" (UID: \"cf8e2426-3f6e-4291-b9ea-77b91670d471\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.835183 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf8e2426-3f6e-4291-b9ea-77b91670d471-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7cjnm\" (UID: \"cf8e2426-3f6e-4291-b9ea-77b91670d471\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.843815 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff8m5\" (UniqueName: \"kubernetes.io/projected/cf8e2426-3f6e-4291-b9ea-77b91670d471-kube-api-access-ff8m5\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7cjnm\" (UID: \"cf8e2426-3f6e-4291-b9ea-77b91670d471\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm" Dec 11 05:37:25 crc kubenswrapper[4628]: I1211 05:37:25.980802 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm" Dec 11 05:37:26 crc kubenswrapper[4628]: I1211 05:37:26.592252 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm"] Dec 11 05:37:27 crc kubenswrapper[4628]: I1211 05:37:27.559177 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm" event={"ID":"cf8e2426-3f6e-4291-b9ea-77b91670d471","Type":"ContainerStarted","Data":"74c8e027a0e95d8cb4febc98de66a94d3bdba82f81e9f6f40678b1e3e826d4e7"} Dec 11 05:37:27 crc kubenswrapper[4628]: I1211 05:37:27.559469 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm" event={"ID":"cf8e2426-3f6e-4291-b9ea-77b91670d471","Type":"ContainerStarted","Data":"0d0d84bc1413acc2b10db3187f46bfac41373138e76a150679aad38c1363859a"} Dec 11 05:37:27 crc kubenswrapper[4628]: I1211 05:37:27.580911 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm" podStartSLOduration=2.050138558 podStartE2EDuration="2.580893072s" podCreationTimestamp="2025-12-11 05:37:25 +0000 UTC" firstStartedPulling="2025-12-11 05:37:26.605184909 +0000 UTC m=+1349.022531617" lastFinishedPulling="2025-12-11 05:37:27.135939413 +0000 UTC m=+1349.553286131" observedRunningTime="2025-12-11 05:37:27.577620544 +0000 UTC m=+1349.994967242" watchObservedRunningTime="2025-12-11 05:37:27.580893072 +0000 UTC m=+1349.998239770" Dec 11 05:37:30 crc kubenswrapper[4628]: I1211 05:37:30.593183 4628 generic.go:334] "Generic (PLEG): container finished" podID="cf8e2426-3f6e-4291-b9ea-77b91670d471" containerID="74c8e027a0e95d8cb4febc98de66a94d3bdba82f81e9f6f40678b1e3e826d4e7" exitCode=0 Dec 11 05:37:30 crc kubenswrapper[4628]: I1211 05:37:30.593270 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm" event={"ID":"cf8e2426-3f6e-4291-b9ea-77b91670d471","Type":"ContainerDied","Data":"74c8e027a0e95d8cb4febc98de66a94d3bdba82f81e9f6f40678b1e3e826d4e7"} Dec 11 05:37:31 crc kubenswrapper[4628]: I1211 05:37:31.427172 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:37:31 crc kubenswrapper[4628]: I1211 05:37:31.427263 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:37:31 crc kubenswrapper[4628]: I1211 05:37:31.427327 4628 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:37:31 crc kubenswrapper[4628]: I1211 05:37:31.428398 4628 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a5a1b35de252bf7b6d284e501103fc4953df20ee7d9a62a56c9a69ef2d0ee180"} pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Dec 11 05:37:31 crc kubenswrapper[4628]: I1211 05:37:31.428502 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" containerID="cri-o://a5a1b35de252bf7b6d284e501103fc4953df20ee7d9a62a56c9a69ef2d0ee180" gracePeriod=600 Dec 11 05:37:31 crc kubenswrapper[4628]: I1211 05:37:31.614387 4628 generic.go:334] "Generic (PLEG): container finished" podID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerID="a5a1b35de252bf7b6d284e501103fc4953df20ee7d9a62a56c9a69ef2d0ee180" exitCode=0 Dec 11 05:37:31 crc kubenswrapper[4628]: I1211 05:37:31.614655 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerDied","Data":"a5a1b35de252bf7b6d284e501103fc4953df20ee7d9a62a56c9a69ef2d0ee180"} Dec 11 05:37:31 crc kubenswrapper[4628]: I1211 05:37:31.614705 4628 scope.go:117] "RemoveContainer" containerID="6ecc8b439306d6103b7fabe922fa79c181a14fe03fbd4c6f00b4023e3934e67c" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.024171 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.189173 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff8m5\" (UniqueName: \"kubernetes.io/projected/cf8e2426-3f6e-4291-b9ea-77b91670d471-kube-api-access-ff8m5\") pod \"cf8e2426-3f6e-4291-b9ea-77b91670d471\" (UID: \"cf8e2426-3f6e-4291-b9ea-77b91670d471\") " Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.189343 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf8e2426-3f6e-4291-b9ea-77b91670d471-ssh-key\") pod \"cf8e2426-3f6e-4291-b9ea-77b91670d471\" (UID: \"cf8e2426-3f6e-4291-b9ea-77b91670d471\") " Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.189420 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf8e2426-3f6e-4291-b9ea-77b91670d471-inventory\") pod \"cf8e2426-3f6e-4291-b9ea-77b91670d471\" (UID: \"cf8e2426-3f6e-4291-b9ea-77b91670d471\") " Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.196412 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf8e2426-3f6e-4291-b9ea-77b91670d471-kube-api-access-ff8m5" (OuterVolumeSpecName: "kube-api-access-ff8m5") pod "cf8e2426-3f6e-4291-b9ea-77b91670d471" (UID: "cf8e2426-3f6e-4291-b9ea-77b91670d471"). InnerVolumeSpecName "kube-api-access-ff8m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.220139 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf8e2426-3f6e-4291-b9ea-77b91670d471-inventory" (OuterVolumeSpecName: "inventory") pod "cf8e2426-3f6e-4291-b9ea-77b91670d471" (UID: "cf8e2426-3f6e-4291-b9ea-77b91670d471"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.221064 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf8e2426-3f6e-4291-b9ea-77b91670d471-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cf8e2426-3f6e-4291-b9ea-77b91670d471" (UID: "cf8e2426-3f6e-4291-b9ea-77b91670d471"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.291431 4628 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cf8e2426-3f6e-4291-b9ea-77b91670d471-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.291460 4628 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf8e2426-3f6e-4291-b9ea-77b91670d471-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.291472 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff8m5\" (UniqueName: \"kubernetes.io/projected/cf8e2426-3f6e-4291-b9ea-77b91670d471-kube-api-access-ff8m5\") on node \"crc\" DevicePath \"\"" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.628417 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02"} Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.631194 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm" event={"ID":"cf8e2426-3f6e-4291-b9ea-77b91670d471","Type":"ContainerDied","Data":"0d0d84bc1413acc2b10db3187f46bfac41373138e76a150679aad38c1363859a"} Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.631234 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d0d84bc1413acc2b10db3187f46bfac41373138e76a150679aad38c1363859a" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.631255 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7cjnm" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.762000 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98"] Dec 11 05:37:32 crc kubenswrapper[4628]: E1211 05:37:32.762568 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf8e2426-3f6e-4291-b9ea-77b91670d471" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.762590 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf8e2426-3f6e-4291-b9ea-77b91670d471" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.762867 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf8e2426-3f6e-4291-b9ea-77b91670d471" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.763514 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.767150 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t5hzf" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.767447 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.767707 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.768037 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.789718 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98"] Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.905587 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjjkr\" (UniqueName: \"kubernetes.io/projected/376d3aeb-b569-4e4e-847a-762ed8f12b35-kube-api-access-fjjkr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98\" (UID: \"376d3aeb-b569-4e4e-847a-762ed8f12b35\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.905635 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/376d3aeb-b569-4e4e-847a-762ed8f12b35-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98\" (UID: \"376d3aeb-b569-4e4e-847a-762ed8f12b35\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.905687 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/376d3aeb-b569-4e4e-847a-762ed8f12b35-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98\" (UID: \"376d3aeb-b569-4e4e-847a-762ed8f12b35\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" Dec 11 05:37:32 crc kubenswrapper[4628]: I1211 05:37:32.905763 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/376d3aeb-b569-4e4e-847a-762ed8f12b35-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98\" (UID: \"376d3aeb-b569-4e4e-847a-762ed8f12b35\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" Dec 11 05:37:33 crc kubenswrapper[4628]: I1211 05:37:33.007122 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/376d3aeb-b569-4e4e-847a-762ed8f12b35-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98\" (UID: \"376d3aeb-b569-4e4e-847a-762ed8f12b35\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" Dec 11 05:37:33 crc kubenswrapper[4628]: I1211 05:37:33.007319 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/376d3aeb-b569-4e4e-847a-762ed8f12b35-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98\" (UID: \"376d3aeb-b569-4e4e-847a-762ed8f12b35\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" Dec 11 05:37:33 crc kubenswrapper[4628]: I1211 05:37:33.007339 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjjkr\" (UniqueName: \"kubernetes.io/projected/376d3aeb-b569-4e4e-847a-762ed8f12b35-kube-api-access-fjjkr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98\" (UID: \"376d3aeb-b569-4e4e-847a-762ed8f12b35\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" Dec 11 05:37:33 crc kubenswrapper[4628]: I1211 05:37:33.007368 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/376d3aeb-b569-4e4e-847a-762ed8f12b35-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98\" (UID: \"376d3aeb-b569-4e4e-847a-762ed8f12b35\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" Dec 11 05:37:33 crc kubenswrapper[4628]: I1211 05:37:33.020949 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/376d3aeb-b569-4e4e-847a-762ed8f12b35-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98\" (UID: \"376d3aeb-b569-4e4e-847a-762ed8f12b35\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" Dec 11 05:37:33 crc kubenswrapper[4628]: I1211 05:37:33.021159 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/376d3aeb-b569-4e4e-847a-762ed8f12b35-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98\" (UID: \"376d3aeb-b569-4e4e-847a-762ed8f12b35\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" Dec 11 05:37:33 crc kubenswrapper[4628]: I1211 05:37:33.022798 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/376d3aeb-b569-4e4e-847a-762ed8f12b35-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98\" (UID: \"376d3aeb-b569-4e4e-847a-762ed8f12b35\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" Dec 11 05:37:33 crc kubenswrapper[4628]: I1211 05:37:33.024304 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjjkr\" (UniqueName: \"kubernetes.io/projected/376d3aeb-b569-4e4e-847a-762ed8f12b35-kube-api-access-fjjkr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98\" (UID: \"376d3aeb-b569-4e4e-847a-762ed8f12b35\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" Dec 11 05:37:33 crc kubenswrapper[4628]: I1211 05:37:33.093079 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" Dec 11 05:37:33 crc kubenswrapper[4628]: I1211 05:37:33.671712 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98"] Dec 11 05:37:34 crc kubenswrapper[4628]: I1211 05:37:34.654796 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" event={"ID":"376d3aeb-b569-4e4e-847a-762ed8f12b35","Type":"ContainerStarted","Data":"68117566df12c9218ec0f7589712831919e57b6a95e9740e4b0ee84dfe002e61"} Dec 11 05:37:34 crc kubenswrapper[4628]: I1211 05:37:34.655408 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" event={"ID":"376d3aeb-b569-4e4e-847a-762ed8f12b35","Type":"ContainerStarted","Data":"727c33eee1c8e6c34f7c10bf7cb29f5ab56703718c3a75ce00ffa38e7677a281"} Dec 11 05:37:34 crc kubenswrapper[4628]: I1211 05:37:34.683152 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" podStartSLOduration=2.094265951 podStartE2EDuration="2.68313494s" podCreationTimestamp="2025-12-11 05:37:32 +0000 UTC" firstStartedPulling="2025-12-11 05:37:33.67050461 +0000 UTC m=+1356.087851308" lastFinishedPulling="2025-12-11 05:37:34.259373599 +0000 UTC m=+1356.676720297" observedRunningTime="2025-12-11 05:37:34.67223984 +0000 UTC m=+1357.089586538" watchObservedRunningTime="2025-12-11 05:37:34.68313494 +0000 UTC m=+1357.100481638" Dec 11 05:38:25 crc kubenswrapper[4628]: I1211 05:38:25.901485 4628 scope.go:117] "RemoveContainer" containerID="b02b72218dd222a768b93cfb0a78581959a67e389b45a4665d82c9724a3f6fca" Dec 11 05:39:31 crc kubenswrapper[4628]: I1211 05:39:31.427063 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:39:31 crc kubenswrapper[4628]: I1211 05:39:31.427879 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:39:54 crc kubenswrapper[4628]: I1211 05:39:54.535027 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4s2fv"] Dec 11 05:39:54 crc kubenswrapper[4628]: I1211 05:39:54.551442 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4s2fv" Dec 11 05:39:54 crc kubenswrapper[4628]: I1211 05:39:54.572470 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4s2fv"] Dec 11 05:39:54 crc kubenswrapper[4628]: I1211 05:39:54.713970 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g8x6\" (UniqueName: \"kubernetes.io/projected/a81b0301-e7e2-4a15-9c6e-cd50d439ef78-kube-api-access-4g8x6\") pod \"redhat-marketplace-4s2fv\" (UID: \"a81b0301-e7e2-4a15-9c6e-cd50d439ef78\") " pod="openshift-marketplace/redhat-marketplace-4s2fv" Dec 11 05:39:54 crc kubenswrapper[4628]: I1211 05:39:54.714674 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a81b0301-e7e2-4a15-9c6e-cd50d439ef78-utilities\") pod \"redhat-marketplace-4s2fv\" (UID: \"a81b0301-e7e2-4a15-9c6e-cd50d439ef78\") " pod="openshift-marketplace/redhat-marketplace-4s2fv" Dec 11 05:39:54 crc kubenswrapper[4628]: I1211 05:39:54.714889 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a81b0301-e7e2-4a15-9c6e-cd50d439ef78-catalog-content\") pod \"redhat-marketplace-4s2fv\" (UID: \"a81b0301-e7e2-4a15-9c6e-cd50d439ef78\") " pod="openshift-marketplace/redhat-marketplace-4s2fv" Dec 11 05:39:54 crc kubenswrapper[4628]: I1211 05:39:54.816106 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a81b0301-e7e2-4a15-9c6e-cd50d439ef78-catalog-content\") pod \"redhat-marketplace-4s2fv\" (UID: \"a81b0301-e7e2-4a15-9c6e-cd50d439ef78\") " pod="openshift-marketplace/redhat-marketplace-4s2fv" Dec 11 05:39:54 crc kubenswrapper[4628]: I1211 05:39:54.816204 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g8x6\" (UniqueName: \"kubernetes.io/projected/a81b0301-e7e2-4a15-9c6e-cd50d439ef78-kube-api-access-4g8x6\") pod \"redhat-marketplace-4s2fv\" (UID: \"a81b0301-e7e2-4a15-9c6e-cd50d439ef78\") " pod="openshift-marketplace/redhat-marketplace-4s2fv" Dec 11 05:39:54 crc kubenswrapper[4628]: I1211 05:39:54.816251 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a81b0301-e7e2-4a15-9c6e-cd50d439ef78-utilities\") pod \"redhat-marketplace-4s2fv\" (UID: \"a81b0301-e7e2-4a15-9c6e-cd50d439ef78\") " pod="openshift-marketplace/redhat-marketplace-4s2fv" Dec 11 05:39:54 crc kubenswrapper[4628]: I1211 05:39:54.816772 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a81b0301-e7e2-4a15-9c6e-cd50d439ef78-catalog-content\") pod \"redhat-marketplace-4s2fv\" (UID: \"a81b0301-e7e2-4a15-9c6e-cd50d439ef78\") " pod="openshift-marketplace/redhat-marketplace-4s2fv" Dec 11 05:39:54 crc kubenswrapper[4628]: I1211 05:39:54.816812 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a81b0301-e7e2-4a15-9c6e-cd50d439ef78-utilities\") pod \"redhat-marketplace-4s2fv\" (UID: \"a81b0301-e7e2-4a15-9c6e-cd50d439ef78\") " pod="openshift-marketplace/redhat-marketplace-4s2fv" Dec 11 05:39:54 crc kubenswrapper[4628]: I1211 05:39:54.840517 4628 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4g8x6\" (UniqueName: \"kubernetes.io/projected/a81b0301-e7e2-4a15-9c6e-cd50d439ef78-kube-api-access-4g8x6\") pod \"redhat-marketplace-4s2fv\" (UID: \"a81b0301-e7e2-4a15-9c6e-cd50d439ef78\") " pod="openshift-marketplace/redhat-marketplace-4s2fv" Dec 11 05:39:54 crc kubenswrapper[4628]: I1211 05:39:54.891962 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4s2fv" Dec 11 05:39:55 crc kubenswrapper[4628]: I1211 05:39:55.396080 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4s2fv"] Dec 11 05:39:55 crc kubenswrapper[4628]: I1211 05:39:55.518450 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4s2fv" event={"ID":"a81b0301-e7e2-4a15-9c6e-cd50d439ef78","Type":"ContainerStarted","Data":"f6fc872b179fc9096923e683e1eba9432488fe2109977b6ce6646979b3002a31"} Dec 11 05:39:56 crc kubenswrapper[4628]: I1211 05:39:56.531596 4628 generic.go:334] "Generic (PLEG): container finished" podID="a81b0301-e7e2-4a15-9c6e-cd50d439ef78" containerID="2febe1efb2022da75ae5887bf98f44e8e8f7626b1dbeeaa40aeb31d41555fbbc" exitCode=0 Dec 11 05:39:56 crc kubenswrapper[4628]: I1211 05:39:56.531997 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4s2fv" event={"ID":"a81b0301-e7e2-4a15-9c6e-cd50d439ef78","Type":"ContainerDied","Data":"2febe1efb2022da75ae5887bf98f44e8e8f7626b1dbeeaa40aeb31d41555fbbc"} Dec 11 05:39:56 crc kubenswrapper[4628]: I1211 05:39:56.535148 4628 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 05:39:57 crc kubenswrapper[4628]: I1211 05:39:57.547309 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4s2fv" event={"ID":"a81b0301-e7e2-4a15-9c6e-cd50d439ef78","Type":"ContainerStarted","Data":"ea02b43a77830484d2f79ba501d4e7cc7024210285070736cf24846268423e49"} Dec 11 05:39:58 crc kubenswrapper[4628]: I1211 05:39:58.556756 4628 generic.go:334] "Generic (PLEG): container finished" podID="a81b0301-e7e2-4a15-9c6e-cd50d439ef78" containerID="ea02b43a77830484d2f79ba501d4e7cc7024210285070736cf24846268423e49" exitCode=0 Dec 11 05:39:58 crc kubenswrapper[4628]: I1211 05:39:58.556806 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4s2fv" event={"ID":"a81b0301-e7e2-4a15-9c6e-cd50d439ef78","Type":"ContainerDied","Data":"ea02b43a77830484d2f79ba501d4e7cc7024210285070736cf24846268423e49"} Dec 11 05:39:59 crc kubenswrapper[4628]: I1211 05:39:59.571916 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4s2fv" event={"ID":"a81b0301-e7e2-4a15-9c6e-cd50d439ef78","Type":"ContainerStarted","Data":"75ff384fcd46b78b0035118e3ffeeb56fb6717fe485ea66c6cba8a87ee1b7b15"} Dec 11 05:39:59 crc kubenswrapper[4628]: I1211 05:39:59.599392 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4s2fv" podStartSLOduration=3.1377962090000002 podStartE2EDuration="5.599369993s" podCreationTimestamp="2025-12-11 05:39:54 +0000 UTC" firstStartedPulling="2025-12-11 05:39:56.534870274 +0000 UTC m=+1498.952216972" lastFinishedPulling="2025-12-11 05:39:58.996444048 +0000 UTC m=+1501.413790756" observedRunningTime="2025-12-11 05:39:59.598248232 +0000 UTC m=+1502.015594950" watchObservedRunningTime="2025-12-11 05:39:59.599369993 +0000 UTC 
m=+1502.016716731" Dec 11 05:40:01 crc kubenswrapper[4628]: I1211 05:40:01.427235 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:40:01 crc kubenswrapper[4628]: I1211 05:40:01.427577 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:40:04 crc kubenswrapper[4628]: I1211 05:40:04.892919 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4s2fv" Dec 11 05:40:04 crc kubenswrapper[4628]: I1211 05:40:04.893601 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4s2fv" Dec 11 05:40:04 crc kubenswrapper[4628]: I1211 05:40:04.943795 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4s2fv" Dec 11 05:40:05 crc kubenswrapper[4628]: I1211 05:40:05.709107 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4s2fv" Dec 11 05:40:05 crc kubenswrapper[4628]: I1211 05:40:05.775072 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4s2fv"] Dec 11 05:40:07 crc kubenswrapper[4628]: I1211 05:40:07.657420 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4s2fv" podUID="a81b0301-e7e2-4a15-9c6e-cd50d439ef78" containerName="registry-server" containerID="cri-o://75ff384fcd46b78b0035118e3ffeeb56fb6717fe485ea66c6cba8a87ee1b7b15" gracePeriod=2 Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.211409 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4s2fv" Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.278018 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g8x6\" (UniqueName: \"kubernetes.io/projected/a81b0301-e7e2-4a15-9c6e-cd50d439ef78-kube-api-access-4g8x6\") pod \"a81b0301-e7e2-4a15-9c6e-cd50d439ef78\" (UID: \"a81b0301-e7e2-4a15-9c6e-cd50d439ef78\") " Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.278306 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a81b0301-e7e2-4a15-9c6e-cd50d439ef78-catalog-content\") pod \"a81b0301-e7e2-4a15-9c6e-cd50d439ef78\" (UID: \"a81b0301-e7e2-4a15-9c6e-cd50d439ef78\") " Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.278388 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a81b0301-e7e2-4a15-9c6e-cd50d439ef78-utilities\") pod \"a81b0301-e7e2-4a15-9c6e-cd50d439ef78\" (UID: \"a81b0301-e7e2-4a15-9c6e-cd50d439ef78\") " Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.278906 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a81b0301-e7e2-4a15-9c6e-cd50d439ef78-utilities" (OuterVolumeSpecName: "utilities") pod "a81b0301-e7e2-4a15-9c6e-cd50d439ef78" (UID: "a81b0301-e7e2-4a15-9c6e-cd50d439ef78"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.279307 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a81b0301-e7e2-4a15-9c6e-cd50d439ef78-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.284935 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a81b0301-e7e2-4a15-9c6e-cd50d439ef78-kube-api-access-4g8x6" (OuterVolumeSpecName: "kube-api-access-4g8x6") pod "a81b0301-e7e2-4a15-9c6e-cd50d439ef78" (UID: "a81b0301-e7e2-4a15-9c6e-cd50d439ef78"). InnerVolumeSpecName "kube-api-access-4g8x6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.328918 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a81b0301-e7e2-4a15-9c6e-cd50d439ef78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a81b0301-e7e2-4a15-9c6e-cd50d439ef78" (UID: "a81b0301-e7e2-4a15-9c6e-cd50d439ef78"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.380973 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a81b0301-e7e2-4a15-9c6e-cd50d439ef78-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.381018 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g8x6\" (UniqueName: \"kubernetes.io/projected/a81b0301-e7e2-4a15-9c6e-cd50d439ef78-kube-api-access-4g8x6\") on node \"crc\" DevicePath \"\"" Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.670057 4628 generic.go:334] "Generic (PLEG): container finished" podID="a81b0301-e7e2-4a15-9c6e-cd50d439ef78" containerID="75ff384fcd46b78b0035118e3ffeeb56fb6717fe485ea66c6cba8a87ee1b7b15" exitCode=0 Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.670104 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4s2fv" event={"ID":"a81b0301-e7e2-4a15-9c6e-cd50d439ef78","Type":"ContainerDied","Data":"75ff384fcd46b78b0035118e3ffeeb56fb6717fe485ea66c6cba8a87ee1b7b15"} Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.670443 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4s2fv" event={"ID":"a81b0301-e7e2-4a15-9c6e-cd50d439ef78","Type":"ContainerDied","Data":"f6fc872b179fc9096923e683e1eba9432488fe2109977b6ce6646979b3002a31"} Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.670489 4628 scope.go:117] "RemoveContainer" containerID="75ff384fcd46b78b0035118e3ffeeb56fb6717fe485ea66c6cba8a87ee1b7b15" Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.670138 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4s2fv" Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.704608 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4s2fv"] Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.705598 4628 scope.go:117] "RemoveContainer" containerID="ea02b43a77830484d2f79ba501d4e7cc7024210285070736cf24846268423e49" Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.713370 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4s2fv"] Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.733213 4628 scope.go:117] "RemoveContainer" containerID="2febe1efb2022da75ae5887bf98f44e8e8f7626b1dbeeaa40aeb31d41555fbbc" Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.782246 4628 scope.go:117] "RemoveContainer" containerID="75ff384fcd46b78b0035118e3ffeeb56fb6717fe485ea66c6cba8a87ee1b7b15" Dec 11 05:40:08 crc kubenswrapper[4628]: E1211 05:40:08.782610 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75ff384fcd46b78b0035118e3ffeeb56fb6717fe485ea66c6cba8a87ee1b7b15\": container with ID starting with 75ff384fcd46b78b0035118e3ffeeb56fb6717fe485ea66c6cba8a87ee1b7b15 not found: ID does not exist" containerID="75ff384fcd46b78b0035118e3ffeeb56fb6717fe485ea66c6cba8a87ee1b7b15" Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.782668 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75ff384fcd46b78b0035118e3ffeeb56fb6717fe485ea66c6cba8a87ee1b7b15"} err="failed to get container status \"75ff384fcd46b78b0035118e3ffeeb56fb6717fe485ea66c6cba8a87ee1b7b15\": rpc error: code = NotFound desc = could not find container \"75ff384fcd46b78b0035118e3ffeeb56fb6717fe485ea66c6cba8a87ee1b7b15\": container with ID starting with 75ff384fcd46b78b0035118e3ffeeb56fb6717fe485ea66c6cba8a87ee1b7b15 not found: ID does not exist" Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.782699 4628 scope.go:117] "RemoveContainer" containerID="ea02b43a77830484d2f79ba501d4e7cc7024210285070736cf24846268423e49" Dec 11 05:40:08 crc kubenswrapper[4628]: E1211 05:40:08.782932 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea02b43a77830484d2f79ba501d4e7cc7024210285070736cf24846268423e49\": container with ID starting with ea02b43a77830484d2f79ba501d4e7cc7024210285070736cf24846268423e49 not found: ID does not exist" containerID="ea02b43a77830484d2f79ba501d4e7cc7024210285070736cf24846268423e49" Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.782965 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea02b43a77830484d2f79ba501d4e7cc7024210285070736cf24846268423e49"} err="failed to get container status \"ea02b43a77830484d2f79ba501d4e7cc7024210285070736cf24846268423e49\": rpc error: code = NotFound desc = could not find container \"ea02b43a77830484d2f79ba501d4e7cc7024210285070736cf24846268423e49\": container with ID starting with ea02b43a77830484d2f79ba501d4e7cc7024210285070736cf24846268423e49 not found: ID does not exist" Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.782983 4628 scope.go:117] "RemoveContainer" containerID="2febe1efb2022da75ae5887bf98f44e8e8f7626b1dbeeaa40aeb31d41555fbbc" Dec 11 05:40:08 crc kubenswrapper[4628]: E1211 05:40:08.783193 4628 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2febe1efb2022da75ae5887bf98f44e8e8f7626b1dbeeaa40aeb31d41555fbbc\": container with ID starting with 2febe1efb2022da75ae5887bf98f44e8e8f7626b1dbeeaa40aeb31d41555fbbc not found: ID does not exist" containerID="2febe1efb2022da75ae5887bf98f44e8e8f7626b1dbeeaa40aeb31d41555fbbc" Dec 11 05:40:08 crc kubenswrapper[4628]: I1211 05:40:08.783251 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2febe1efb2022da75ae5887bf98f44e8e8f7626b1dbeeaa40aeb31d41555fbbc"} err="failed to get container status \"2febe1efb2022da75ae5887bf98f44e8e8f7626b1dbeeaa40aeb31d41555fbbc\": rpc error: code = NotFound desc = could not find container \"2febe1efb2022da75ae5887bf98f44e8e8f7626b1dbeeaa40aeb31d41555fbbc\": container with ID starting with 2febe1efb2022da75ae5887bf98f44e8e8f7626b1dbeeaa40aeb31d41555fbbc not found: ID does not exist" Dec 11 05:40:09 crc kubenswrapper[4628]: I1211 05:40:09.904367 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a81b0301-e7e2-4a15-9c6e-cd50d439ef78" path="/var/lib/kubelet/pods/a81b0301-e7e2-4a15-9c6e-cd50d439ef78/volumes" Dec 11 05:40:31 crc kubenswrapper[4628]: I1211 05:40:31.427322 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:40:31 crc kubenswrapper[4628]: I1211 05:40:31.427885 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:40:31 crc kubenswrapper[4628]: I1211 05:40:31.427935 4628 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:40:31 crc kubenswrapper[4628]: I1211 05:40:31.428729 4628 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02"} pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 05:40:31 crc kubenswrapper[4628]: I1211 05:40:31.428800 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" containerID="cri-o://d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" gracePeriod=600 Dec 11 05:40:31 crc kubenswrapper[4628]: E1211 05:40:31.554459 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:40:31 crc kubenswrapper[4628]: I1211 05:40:31.971866 4628 generic.go:334] 
"Generic (PLEG): container finished" podID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" exitCode=0 Dec 11 05:40:31 crc kubenswrapper[4628]: I1211 05:40:31.971876 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerDied","Data":"d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02"} Dec 11 05:40:31 crc kubenswrapper[4628]: I1211 05:40:31.971975 4628 scope.go:117] "RemoveContainer" containerID="a5a1b35de252bf7b6d284e501103fc4953df20ee7d9a62a56c9a69ef2d0ee180" Dec 11 05:40:31 crc kubenswrapper[4628]: I1211 05:40:31.972820 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:40:31 crc kubenswrapper[4628]: E1211 05:40:31.973328 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:40:32 crc kubenswrapper[4628]: I1211 05:40:32.232869 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mdgnq"] Dec 11 05:40:32 crc kubenswrapper[4628]: E1211 05:40:32.233698 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81b0301-e7e2-4a15-9c6e-cd50d439ef78" containerName="extract-utilities" Dec 11 05:40:32 crc kubenswrapper[4628]: I1211 05:40:32.233762 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81b0301-e7e2-4a15-9c6e-cd50d439ef78" containerName="extract-utilities" Dec 11 05:40:32 crc kubenswrapper[4628]: E1211 05:40:32.233887 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81b0301-e7e2-4a15-9c6e-cd50d439ef78" containerName="registry-server" Dec 11 05:40:32 crc kubenswrapper[4628]: I1211 05:40:32.234141 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81b0301-e7e2-4a15-9c6e-cd50d439ef78" containerName="registry-server" Dec 11 05:40:32 crc kubenswrapper[4628]: E1211 05:40:32.234194 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81b0301-e7e2-4a15-9c6e-cd50d439ef78" containerName="extract-content" Dec 11 05:40:32 crc kubenswrapper[4628]: I1211 05:40:32.234241 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81b0301-e7e2-4a15-9c6e-cd50d439ef78" containerName="extract-content" Dec 11 05:40:32 crc kubenswrapper[4628]: I1211 05:40:32.234716 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="a81b0301-e7e2-4a15-9c6e-cd50d439ef78" containerName="registry-server" Dec 11 05:40:32 crc kubenswrapper[4628]: I1211 05:40:32.236375 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mdgnq" Dec 11 05:40:32 crc kubenswrapper[4628]: I1211 05:40:32.246164 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mdgnq"] Dec 11 05:40:32 crc kubenswrapper[4628]: I1211 05:40:32.391623 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chr8h\" (UniqueName: \"kubernetes.io/projected/a8af5a90-5182-4e1c-83bb-f792c85412de-kube-api-access-chr8h\") pod \"certified-operators-mdgnq\" (UID: \"a8af5a90-5182-4e1c-83bb-f792c85412de\") " pod="openshift-marketplace/certified-operators-mdgnq" Dec 11 05:40:32 crc kubenswrapper[4628]: I1211 05:40:32.391938 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8af5a90-5182-4e1c-83bb-f792c85412de-utilities\") pod \"certified-operators-mdgnq\" (UID: \"a8af5a90-5182-4e1c-83bb-f792c85412de\") " pod="openshift-marketplace/certified-operators-mdgnq" Dec 11 05:40:32 crc kubenswrapper[4628]: I1211 05:40:32.392074 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8af5a90-5182-4e1c-83bb-f792c85412de-catalog-content\") pod \"certified-operators-mdgnq\" (UID: \"a8af5a90-5182-4e1c-83bb-f792c85412de\") " pod="openshift-marketplace/certified-operators-mdgnq" Dec 11 05:40:32 crc kubenswrapper[4628]: I1211 05:40:32.493481 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8af5a90-5182-4e1c-83bb-f792c85412de-utilities\") pod \"certified-operators-mdgnq\" (UID: \"a8af5a90-5182-4e1c-83bb-f792c85412de\") " pod="openshift-marketplace/certified-operators-mdgnq" Dec 11 05:40:32 crc kubenswrapper[4628]: I1211 05:40:32.493563 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8af5a90-5182-4e1c-83bb-f792c85412de-catalog-content\") pod \"certified-operators-mdgnq\" (UID: \"a8af5a90-5182-4e1c-83bb-f792c85412de\") " pod="openshift-marketplace/certified-operators-mdgnq" Dec 11 05:40:32 crc kubenswrapper[4628]: I1211 05:40:32.493678 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chr8h\" (UniqueName: \"kubernetes.io/projected/a8af5a90-5182-4e1c-83bb-f792c85412de-kube-api-access-chr8h\") pod \"certified-operators-mdgnq\" (UID: \"a8af5a90-5182-4e1c-83bb-f792c85412de\") " pod="openshift-marketplace/certified-operators-mdgnq" Dec 11 05:40:32 crc kubenswrapper[4628]: I1211 05:40:32.493977 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8af5a90-5182-4e1c-83bb-f792c85412de-utilities\") pod \"certified-operators-mdgnq\" (UID: \"a8af5a90-5182-4e1c-83bb-f792c85412de\") " pod="openshift-marketplace/certified-operators-mdgnq" Dec 11 05:40:32 crc kubenswrapper[4628]: I1211 05:40:32.495649 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8af5a90-5182-4e1c-83bb-f792c85412de-catalog-content\") pod \"certified-operators-mdgnq\" (UID: \"a8af5a90-5182-4e1c-83bb-f792c85412de\") " pod="openshift-marketplace/certified-operators-mdgnq" Dec 11 05:40:32 crc kubenswrapper[4628]: I1211 05:40:32.517608 4628 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-chr8h\" (UniqueName: \"kubernetes.io/projected/a8af5a90-5182-4e1c-83bb-f792c85412de-kube-api-access-chr8h\") pod \"certified-operators-mdgnq\" (UID: \"a8af5a90-5182-4e1c-83bb-f792c85412de\") " pod="openshift-marketplace/certified-operators-mdgnq" Dec 11 05:40:32 crc kubenswrapper[4628]: I1211 05:40:32.579087 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mdgnq" Dec 11 05:40:33 crc kubenswrapper[4628]: I1211 05:40:33.155326 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mdgnq"] Dec 11 05:40:34 crc kubenswrapper[4628]: I1211 05:40:34.002214 4628 generic.go:334] "Generic (PLEG): container finished" podID="a8af5a90-5182-4e1c-83bb-f792c85412de" containerID="c5cebd074f2d8b0fee775e24787bd3cc4905c7aa1a844e194b245a58739ed464" exitCode=0 Dec 11 05:40:34 crc kubenswrapper[4628]: I1211 05:40:34.002289 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdgnq" event={"ID":"a8af5a90-5182-4e1c-83bb-f792c85412de","Type":"ContainerDied","Data":"c5cebd074f2d8b0fee775e24787bd3cc4905c7aa1a844e194b245a58739ed464"} Dec 11 05:40:34 crc kubenswrapper[4628]: I1211 05:40:34.002604 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdgnq" event={"ID":"a8af5a90-5182-4e1c-83bb-f792c85412de","Type":"ContainerStarted","Data":"de8836d62b3ebc30ed07aff3a0683a6dea599968915213d78b69b4a286799b5d"} Dec 11 05:40:35 crc kubenswrapper[4628]: I1211 05:40:35.011923 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdgnq" event={"ID":"a8af5a90-5182-4e1c-83bb-f792c85412de","Type":"ContainerStarted","Data":"b244f61f984278e2fbb5ca8e1d46d0a29ce148880a702654bf3eadba92725302"} Dec 11 05:40:36 crc kubenswrapper[4628]: I1211 05:40:36.020882 4628 generic.go:334] "Generic (PLEG): container finished" podID="a8af5a90-5182-4e1c-83bb-f792c85412de" containerID="b244f61f984278e2fbb5ca8e1d46d0a29ce148880a702654bf3eadba92725302" exitCode=0 Dec 11 05:40:36 crc kubenswrapper[4628]: I1211 05:40:36.020948 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdgnq" event={"ID":"a8af5a90-5182-4e1c-83bb-f792c85412de","Type":"ContainerDied","Data":"b244f61f984278e2fbb5ca8e1d46d0a29ce148880a702654bf3eadba92725302"} Dec 11 05:40:37 crc kubenswrapper[4628]: I1211 05:40:37.034429 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdgnq" event={"ID":"a8af5a90-5182-4e1c-83bb-f792c85412de","Type":"ContainerStarted","Data":"ba4325286311949f2e9d58735b0a701ef0675dc503214a040420f744bef551e9"} Dec 11 05:40:38 crc kubenswrapper[4628]: I1211 05:40:38.064878 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mdgnq" podStartSLOduration=3.566008149 podStartE2EDuration="6.064861121s" podCreationTimestamp="2025-12-11 05:40:32 +0000 UTC" firstStartedPulling="2025-12-11 05:40:34.004651793 +0000 UTC m=+1536.421998491" lastFinishedPulling="2025-12-11 05:40:36.503504755 +0000 UTC m=+1538.920851463" observedRunningTime="2025-12-11 05:40:38.057070541 +0000 UTC m=+1540.474417239" watchObservedRunningTime="2025-12-11 05:40:38.064861121 +0000 UTC m=+1540.482207809" Dec 11 05:40:42 crc kubenswrapper[4628]: I1211 05:40:42.579501 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-mdgnq" Dec 11 05:40:42 crc kubenswrapper[4628]: I1211 05:40:42.580127 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mdgnq" Dec 11 05:40:42 crc kubenswrapper[4628]: I1211 05:40:42.639074 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mdgnq" Dec 11 05:40:43 crc kubenswrapper[4628]: I1211 05:40:43.132941 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mdgnq" Dec 11 05:40:43 crc kubenswrapper[4628]: I1211 05:40:43.191309 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mdgnq"] Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.110639 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mdgnq" podUID="a8af5a90-5182-4e1c-83bb-f792c85412de" containerName="registry-server" containerID="cri-o://ba4325286311949f2e9d58735b0a701ef0675dc503214a040420f744bef551e9" gracePeriod=2 Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.298195 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fqhh6"] Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.300359 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fqhh6" Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.350013 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fqhh6"] Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.454511 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lncn\" (UniqueName: \"kubernetes.io/projected/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d-kube-api-access-4lncn\") pod \"community-operators-fqhh6\" (UID: \"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d\") " pod="openshift-marketplace/community-operators-fqhh6" Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.455046 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d-catalog-content\") pod \"community-operators-fqhh6\" (UID: \"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d\") " pod="openshift-marketplace/community-operators-fqhh6" Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.455102 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d-utilities\") pod \"community-operators-fqhh6\" (UID: \"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d\") " pod="openshift-marketplace/community-operators-fqhh6" Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.556299 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lncn\" (UniqueName: \"kubernetes.io/projected/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d-kube-api-access-4lncn\") pod \"community-operators-fqhh6\" (UID: \"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d\") " pod="openshift-marketplace/community-operators-fqhh6" Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.556455 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d-catalog-content\") pod \"community-operators-fqhh6\" (UID: \"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d\") " pod="openshift-marketplace/community-operators-fqhh6" Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.556520 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d-utilities\") pod \"community-operators-fqhh6\" (UID: \"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d\") " pod="openshift-marketplace/community-operators-fqhh6" Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.556978 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d-catalog-content\") pod \"community-operators-fqhh6\" (UID: \"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d\") " pod="openshift-marketplace/community-operators-fqhh6" Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.557117 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d-utilities\") pod \"community-operators-fqhh6\" (UID: \"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d\") " pod="openshift-marketplace/community-operators-fqhh6" Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.578500 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lncn\" (UniqueName: \"kubernetes.io/projected/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d-kube-api-access-4lncn\") pod \"community-operators-fqhh6\" (UID: \"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d\") " pod="openshift-marketplace/community-operators-fqhh6" Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.617536 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fqhh6" Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.642400 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mdgnq" Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.760474 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chr8h\" (UniqueName: \"kubernetes.io/projected/a8af5a90-5182-4e1c-83bb-f792c85412de-kube-api-access-chr8h\") pod \"a8af5a90-5182-4e1c-83bb-f792c85412de\" (UID: \"a8af5a90-5182-4e1c-83bb-f792c85412de\") " Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.760816 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8af5a90-5182-4e1c-83bb-f792c85412de-utilities\") pod \"a8af5a90-5182-4e1c-83bb-f792c85412de\" (UID: \"a8af5a90-5182-4e1c-83bb-f792c85412de\") " Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.760862 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8af5a90-5182-4e1c-83bb-f792c85412de-catalog-content\") pod \"a8af5a90-5182-4e1c-83bb-f792c85412de\" (UID: \"a8af5a90-5182-4e1c-83bb-f792c85412de\") " Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.762744 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8af5a90-5182-4e1c-83bb-f792c85412de-utilities" (OuterVolumeSpecName: "utilities") pod "a8af5a90-5182-4e1c-83bb-f792c85412de" (UID: "a8af5a90-5182-4e1c-83bb-f792c85412de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.765926 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8af5a90-5182-4e1c-83bb-f792c85412de-kube-api-access-chr8h" (OuterVolumeSpecName: "kube-api-access-chr8h") pod "a8af5a90-5182-4e1c-83bb-f792c85412de" (UID: "a8af5a90-5182-4e1c-83bb-f792c85412de"). InnerVolumeSpecName "kube-api-access-chr8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.837266 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8af5a90-5182-4e1c-83bb-f792c85412de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8af5a90-5182-4e1c-83bb-f792c85412de" (UID: "a8af5a90-5182-4e1c-83bb-f792c85412de"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.863931 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8af5a90-5182-4e1c-83bb-f792c85412de-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.863960 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8af5a90-5182-4e1c-83bb-f792c85412de-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:40:45 crc kubenswrapper[4628]: I1211 05:40:45.863970 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chr8h\" (UniqueName: \"kubernetes.io/projected/a8af5a90-5182-4e1c-83bb-f792c85412de-kube-api-access-chr8h\") on node \"crc\" DevicePath \"\"" Dec 11 05:40:46 crc kubenswrapper[4628]: I1211 05:40:46.002199 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fqhh6"] Dec 11 05:40:46 crc kubenswrapper[4628]: W1211 05:40:46.009101 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c4ac6d7_2a29_4b6d_bba1_7d3f6ffbd50d.slice/crio-88feeb6c95003054f839f4103409b6827082d1cdfbbd2e72c07b00e49ea9b5f8 WatchSource:0}: Error finding container 88feeb6c95003054f839f4103409b6827082d1cdfbbd2e72c07b00e49ea9b5f8: Status 404 returned error can't find the container with id 88feeb6c95003054f839f4103409b6827082d1cdfbbd2e72c07b00e49ea9b5f8 Dec 11 05:40:46 crc kubenswrapper[4628]: I1211 05:40:46.119072 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqhh6" event={"ID":"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d","Type":"ContainerStarted","Data":"88feeb6c95003054f839f4103409b6827082d1cdfbbd2e72c07b00e49ea9b5f8"} Dec 11 05:40:46 crc kubenswrapper[4628]: I1211 05:40:46.121751 4628 generic.go:334] "Generic (PLEG): container finished" podID="a8af5a90-5182-4e1c-83bb-f792c85412de" containerID="ba4325286311949f2e9d58735b0a701ef0675dc503214a040420f744bef551e9" exitCode=0 Dec 11 05:40:46 crc kubenswrapper[4628]: I1211 05:40:46.121774 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdgnq" event={"ID":"a8af5a90-5182-4e1c-83bb-f792c85412de","Type":"ContainerDied","Data":"ba4325286311949f2e9d58735b0a701ef0675dc503214a040420f744bef551e9"} Dec 11 05:40:46 crc kubenswrapper[4628]: I1211 05:40:46.121802 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdgnq" event={"ID":"a8af5a90-5182-4e1c-83bb-f792c85412de","Type":"ContainerDied","Data":"de8836d62b3ebc30ed07aff3a0683a6dea599968915213d78b69b4a286799b5d"} Dec 11 05:40:46 crc kubenswrapper[4628]: I1211 05:40:46.121820 4628 scope.go:117] "RemoveContainer" containerID="ba4325286311949f2e9d58735b0a701ef0675dc503214a040420f744bef551e9" Dec 11 05:40:46 crc kubenswrapper[4628]: I1211 05:40:46.121833 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mdgnq" Dec 11 05:40:46 crc kubenswrapper[4628]: I1211 05:40:46.160903 4628 scope.go:117] "RemoveContainer" containerID="b244f61f984278e2fbb5ca8e1d46d0a29ce148880a702654bf3eadba92725302" Dec 11 05:40:46 crc kubenswrapper[4628]: I1211 05:40:46.167921 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mdgnq"] Dec 11 05:40:46 crc kubenswrapper[4628]: I1211 05:40:46.178074 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mdgnq"] Dec 11 05:40:46 crc kubenswrapper[4628]: I1211 05:40:46.187680 4628 scope.go:117] "RemoveContainer" containerID="c5cebd074f2d8b0fee775e24787bd3cc4905c7aa1a844e194b245a58739ed464" Dec 11 05:40:46 crc kubenswrapper[4628]: I1211 05:40:46.214012 4628 scope.go:117] "RemoveContainer" containerID="ba4325286311949f2e9d58735b0a701ef0675dc503214a040420f744bef551e9" Dec 11 05:40:46 crc kubenswrapper[4628]: E1211 05:40:46.216169 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba4325286311949f2e9d58735b0a701ef0675dc503214a040420f744bef551e9\": container with ID starting with ba4325286311949f2e9d58735b0a701ef0675dc503214a040420f744bef551e9 not found: ID does not exist" containerID="ba4325286311949f2e9d58735b0a701ef0675dc503214a040420f744bef551e9" Dec 11 05:40:46 crc kubenswrapper[4628]: I1211 05:40:46.216199 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba4325286311949f2e9d58735b0a701ef0675dc503214a040420f744bef551e9"} err="failed to get container status \"ba4325286311949f2e9d58735b0a701ef0675dc503214a040420f744bef551e9\": rpc error: code = NotFound desc = could not find container \"ba4325286311949f2e9d58735b0a701ef0675dc503214a040420f744bef551e9\": container with ID starting with ba4325286311949f2e9d58735b0a701ef0675dc503214a040420f744bef551e9 not found: ID does not exist" Dec 11 05:40:46 crc kubenswrapper[4628]: I1211 05:40:46.216224 4628 scope.go:117] "RemoveContainer" containerID="b244f61f984278e2fbb5ca8e1d46d0a29ce148880a702654bf3eadba92725302" Dec 11 05:40:46 crc kubenswrapper[4628]: E1211 05:40:46.216545 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b244f61f984278e2fbb5ca8e1d46d0a29ce148880a702654bf3eadba92725302\": container with ID starting with b244f61f984278e2fbb5ca8e1d46d0a29ce148880a702654bf3eadba92725302 not found: ID does not exist" containerID="b244f61f984278e2fbb5ca8e1d46d0a29ce148880a702654bf3eadba92725302" Dec 11 05:40:46 crc kubenswrapper[4628]: I1211 05:40:46.216564 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b244f61f984278e2fbb5ca8e1d46d0a29ce148880a702654bf3eadba92725302"} err="failed to get container status \"b244f61f984278e2fbb5ca8e1d46d0a29ce148880a702654bf3eadba92725302\": rpc error: code = NotFound desc = could not find container \"b244f61f984278e2fbb5ca8e1d46d0a29ce148880a702654bf3eadba92725302\": container with ID starting with b244f61f984278e2fbb5ca8e1d46d0a29ce148880a702654bf3eadba92725302 not found: ID does not exist" Dec 11 05:40:46 crc kubenswrapper[4628]: I1211 05:40:46.216576 4628 scope.go:117] "RemoveContainer" containerID="c5cebd074f2d8b0fee775e24787bd3cc4905c7aa1a844e194b245a58739ed464" Dec 11 05:40:46 crc kubenswrapper[4628]: E1211 05:40:46.216798 4628 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c5cebd074f2d8b0fee775e24787bd3cc4905c7aa1a844e194b245a58739ed464\": container with ID starting with c5cebd074f2d8b0fee775e24787bd3cc4905c7aa1a844e194b245a58739ed464 not found: ID does not exist" containerID="c5cebd074f2d8b0fee775e24787bd3cc4905c7aa1a844e194b245a58739ed464" Dec 11 05:40:46 crc kubenswrapper[4628]: I1211 05:40:46.216819 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5cebd074f2d8b0fee775e24787bd3cc4905c7aa1a844e194b245a58739ed464"} err="failed to get container status \"c5cebd074f2d8b0fee775e24787bd3cc4905c7aa1a844e194b245a58739ed464\": rpc error: code = NotFound desc = could not find container \"c5cebd074f2d8b0fee775e24787bd3cc4905c7aa1a844e194b245a58739ed464\": container with ID starting with c5cebd074f2d8b0fee775e24787bd3cc4905c7aa1a844e194b245a58739ed464 not found: ID does not exist" Dec 11 05:40:46 crc kubenswrapper[4628]: I1211 05:40:46.889397 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:40:46 crc kubenswrapper[4628]: E1211 05:40:46.890199 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:40:47 crc kubenswrapper[4628]: I1211 05:40:47.133380 4628 generic.go:334] "Generic (PLEG): container finished" podID="9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d" containerID="66a1dd10d0b93a544b97267cc3367f65be12212d66dc6ade37886b6cb8a2ddaf" exitCode=0 Dec 11 05:40:47 crc kubenswrapper[4628]: I1211 05:40:47.133462 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqhh6" event={"ID":"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d","Type":"ContainerDied","Data":"66a1dd10d0b93a544b97267cc3367f65be12212d66dc6ade37886b6cb8a2ddaf"} Dec 11 05:40:47 crc kubenswrapper[4628]: I1211 05:40:47.905142 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8af5a90-5182-4e1c-83bb-f792c85412de" path="/var/lib/kubelet/pods/a8af5a90-5182-4e1c-83bb-f792c85412de/volumes" Dec 11 05:40:48 crc kubenswrapper[4628]: I1211 05:40:48.145363 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqhh6" event={"ID":"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d","Type":"ContainerStarted","Data":"81ad422ff17058e11b7ede50c9535b46a9d5e9d76adb682fd5d0b968d537ead0"} Dec 11 05:40:49 crc kubenswrapper[4628]: I1211 05:40:49.163040 4628 generic.go:334] "Generic (PLEG): container finished" podID="9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d" containerID="81ad422ff17058e11b7ede50c9535b46a9d5e9d76adb682fd5d0b968d537ead0" exitCode=0 Dec 11 05:40:49 crc kubenswrapper[4628]: I1211 05:40:49.163361 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqhh6" event={"ID":"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d","Type":"ContainerDied","Data":"81ad422ff17058e11b7ede50c9535b46a9d5e9d76adb682fd5d0b968d537ead0"} Dec 11 05:40:51 crc kubenswrapper[4628]: I1211 05:40:51.184313 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqhh6" 
event={"ID":"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d","Type":"ContainerStarted","Data":"5ba7d83b4c8b0dd06ffb20e62e9cbe2d138176376db8fb30818bc4e2562aa2a4"} Dec 11 05:40:51 crc kubenswrapper[4628]: I1211 05:40:51.212148 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fqhh6" podStartSLOduration=2.539309332 podStartE2EDuration="6.21212503s" podCreationTimestamp="2025-12-11 05:40:45 +0000 UTC" firstStartedPulling="2025-12-11 05:40:47.136898816 +0000 UTC m=+1549.554245514" lastFinishedPulling="2025-12-11 05:40:50.809714504 +0000 UTC m=+1553.227061212" observedRunningTime="2025-12-11 05:40:51.205731637 +0000 UTC m=+1553.623078335" watchObservedRunningTime="2025-12-11 05:40:51.21212503 +0000 UTC m=+1553.629471738" Dec 11 05:40:55 crc kubenswrapper[4628]: I1211 05:40:55.618263 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fqhh6" Dec 11 05:40:55 crc kubenswrapper[4628]: I1211 05:40:55.620002 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fqhh6" Dec 11 05:40:55 crc kubenswrapper[4628]: I1211 05:40:55.677867 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fqhh6" Dec 11 05:40:56 crc kubenswrapper[4628]: I1211 05:40:56.436521 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fqhh6" Dec 11 05:40:56 crc kubenswrapper[4628]: I1211 05:40:56.505466 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fqhh6"] Dec 11 05:40:58 crc kubenswrapper[4628]: I1211 05:40:58.373575 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fqhh6" podUID="9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d" containerName="registry-server" containerID="cri-o://5ba7d83b4c8b0dd06ffb20e62e9cbe2d138176376db8fb30818bc4e2562aa2a4" gracePeriod=2 Dec 11 05:40:58 crc kubenswrapper[4628]: I1211 05:40:58.840019 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fqhh6" Dec 11 05:40:58 crc kubenswrapper[4628]: I1211 05:40:58.926419 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d-utilities\") pod \"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d\" (UID: \"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d\") " Dec 11 05:40:58 crc kubenswrapper[4628]: I1211 05:40:58.926464 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d-catalog-content\") pod \"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d\" (UID: \"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d\") " Dec 11 05:40:58 crc kubenswrapper[4628]: I1211 05:40:58.926521 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lncn\" (UniqueName: \"kubernetes.io/projected/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d-kube-api-access-4lncn\") pod \"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d\" (UID: \"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d\") " Dec 11 05:40:58 crc kubenswrapper[4628]: I1211 05:40:58.927328 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d-utilities" (OuterVolumeSpecName: "utilities") pod "9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d" (UID: "9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:40:58 crc kubenswrapper[4628]: I1211 05:40:58.934809 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d-kube-api-access-4lncn" (OuterVolumeSpecName: "kube-api-access-4lncn") pod "9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d" (UID: "9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d"). InnerVolumeSpecName "kube-api-access-4lncn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:40:58 crc kubenswrapper[4628]: I1211 05:40:58.978892 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d" (UID: "9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.028584 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.028617 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.028629 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lncn\" (UniqueName: \"kubernetes.io/projected/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d-kube-api-access-4lncn\") on node \"crc\" DevicePath \"\"" Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.390362 4628 generic.go:334] "Generic (PLEG): container finished" podID="9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d" containerID="5ba7d83b4c8b0dd06ffb20e62e9cbe2d138176376db8fb30818bc4e2562aa2a4" exitCode=0 Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.390441 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqhh6" event={"ID":"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d","Type":"ContainerDied","Data":"5ba7d83b4c8b0dd06ffb20e62e9cbe2d138176376db8fb30818bc4e2562aa2a4"} Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.390497 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fqhh6" event={"ID":"9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d","Type":"ContainerDied","Data":"88feeb6c95003054f839f4103409b6827082d1cdfbbd2e72c07b00e49ea9b5f8"} Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.390530 4628 scope.go:117] "RemoveContainer" containerID="5ba7d83b4c8b0dd06ffb20e62e9cbe2d138176376db8fb30818bc4e2562aa2a4" Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.392313 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fqhh6" Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.424260 4628 scope.go:117] "RemoveContainer" containerID="81ad422ff17058e11b7ede50c9535b46a9d5e9d76adb682fd5d0b968d537ead0" Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.451751 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fqhh6"] Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.467344 4628 scope.go:117] "RemoveContainer" containerID="66a1dd10d0b93a544b97267cc3367f65be12212d66dc6ade37886b6cb8a2ddaf" Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.484704 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fqhh6"] Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.523164 4628 scope.go:117] "RemoveContainer" containerID="5ba7d83b4c8b0dd06ffb20e62e9cbe2d138176376db8fb30818bc4e2562aa2a4" Dec 11 05:40:59 crc kubenswrapper[4628]: E1211 05:40:59.523648 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ba7d83b4c8b0dd06ffb20e62e9cbe2d138176376db8fb30818bc4e2562aa2a4\": container with ID starting with 5ba7d83b4c8b0dd06ffb20e62e9cbe2d138176376db8fb30818bc4e2562aa2a4 not found: ID does not exist" containerID="5ba7d83b4c8b0dd06ffb20e62e9cbe2d138176376db8fb30818bc4e2562aa2a4" Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.523687 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ba7d83b4c8b0dd06ffb20e62e9cbe2d138176376db8fb30818bc4e2562aa2a4"} err="failed to get container status \"5ba7d83b4c8b0dd06ffb20e62e9cbe2d138176376db8fb30818bc4e2562aa2a4\": rpc error: code = NotFound desc = could not find container \"5ba7d83b4c8b0dd06ffb20e62e9cbe2d138176376db8fb30818bc4e2562aa2a4\": container with ID starting with 5ba7d83b4c8b0dd06ffb20e62e9cbe2d138176376db8fb30818bc4e2562aa2a4 not found: ID does not exist" Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.523742 4628 scope.go:117] "RemoveContainer" containerID="81ad422ff17058e11b7ede50c9535b46a9d5e9d76adb682fd5d0b968d537ead0" Dec 11 05:40:59 crc kubenswrapper[4628]: E1211 05:40:59.524057 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81ad422ff17058e11b7ede50c9535b46a9d5e9d76adb682fd5d0b968d537ead0\": container with ID starting with 81ad422ff17058e11b7ede50c9535b46a9d5e9d76adb682fd5d0b968d537ead0 not found: ID does not exist" containerID="81ad422ff17058e11b7ede50c9535b46a9d5e9d76adb682fd5d0b968d537ead0" Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.524077 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ad422ff17058e11b7ede50c9535b46a9d5e9d76adb682fd5d0b968d537ead0"} err="failed to get container status \"81ad422ff17058e11b7ede50c9535b46a9d5e9d76adb682fd5d0b968d537ead0\": rpc error: code = NotFound desc = could not find container \"81ad422ff17058e11b7ede50c9535b46a9d5e9d76adb682fd5d0b968d537ead0\": container with ID starting with 81ad422ff17058e11b7ede50c9535b46a9d5e9d76adb682fd5d0b968d537ead0 not found: ID does not exist" Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.524091 4628 scope.go:117] "RemoveContainer" containerID="66a1dd10d0b93a544b97267cc3367f65be12212d66dc6ade37886b6cb8a2ddaf" Dec 11 05:40:59 crc kubenswrapper[4628]: E1211 05:40:59.524333 4628 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"66a1dd10d0b93a544b97267cc3367f65be12212d66dc6ade37886b6cb8a2ddaf\": container with ID starting with 66a1dd10d0b93a544b97267cc3367f65be12212d66dc6ade37886b6cb8a2ddaf not found: ID does not exist" containerID="66a1dd10d0b93a544b97267cc3367f65be12212d66dc6ade37886b6cb8a2ddaf" Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.524359 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a1dd10d0b93a544b97267cc3367f65be12212d66dc6ade37886b6cb8a2ddaf"} err="failed to get container status \"66a1dd10d0b93a544b97267cc3367f65be12212d66dc6ade37886b6cb8a2ddaf\": rpc error: code = NotFound desc = could not find container \"66a1dd10d0b93a544b97267cc3367f65be12212d66dc6ade37886b6cb8a2ddaf\": container with ID starting with 66a1dd10d0b93a544b97267cc3367f65be12212d66dc6ade37886b6cb8a2ddaf not found: ID does not exist" Dec 11 05:40:59 crc kubenswrapper[4628]: I1211 05:40:59.906532 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d" path="/var/lib/kubelet/pods/9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d/volumes" Dec 11 05:41:01 crc kubenswrapper[4628]: I1211 05:41:01.889446 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:41:01 crc kubenswrapper[4628]: E1211 05:41:01.890035 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:41:04 crc kubenswrapper[4628]: I1211 05:41:04.069291 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2a5a-account-create-update-qmvjk"] Dec 11 05:41:04 crc kubenswrapper[4628]: I1211 05:41:04.085364 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c49b-account-create-update-d7hdm"] Dec 11 05:41:04 crc kubenswrapper[4628]: I1211 05:41:04.096997 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-k98qz"] Dec 11 05:41:04 crc kubenswrapper[4628]: I1211 05:41:04.110356 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-24c8x"] Dec 11 05:41:04 crc kubenswrapper[4628]: I1211 05:41:04.118270 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-mzcb5"] Dec 11 05:41:04 crc kubenswrapper[4628]: I1211 05:41:04.130608 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ac9c-account-create-update-9jpmx"] Dec 11 05:41:04 crc kubenswrapper[4628]: I1211 05:41:04.141518 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-24c8x"] Dec 11 05:41:04 crc kubenswrapper[4628]: I1211 05:41:04.152361 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-mzcb5"] Dec 11 05:41:04 crc kubenswrapper[4628]: I1211 05:41:04.160986 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c49b-account-create-update-d7hdm"] Dec 11 05:41:04 crc kubenswrapper[4628]: I1211 05:41:04.167473 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-k98qz"] Dec 11 05:41:04 crc 
kubenswrapper[4628]: I1211 05:41:04.174316 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2a5a-account-create-update-qmvjk"] Dec 11 05:41:04 crc kubenswrapper[4628]: I1211 05:41:04.181890 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-ac9c-account-create-update-9jpmx"] Dec 11 05:41:05 crc kubenswrapper[4628]: I1211 05:41:05.906219 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08120cb0-d26e-4520-a972-829bde3491dc" path="/var/lib/kubelet/pods/08120cb0-d26e-4520-a972-829bde3491dc/volumes" Dec 11 05:41:05 crc kubenswrapper[4628]: I1211 05:41:05.907828 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1318bf7e-ab46-425f-b121-4423d0623af6" path="/var/lib/kubelet/pods/1318bf7e-ab46-425f-b121-4423d0623af6/volumes" Dec 11 05:41:05 crc kubenswrapper[4628]: I1211 05:41:05.909345 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="300bbc9f-12f3-4da2-bb3f-85a458b574cf" path="/var/lib/kubelet/pods/300bbc9f-12f3-4da2-bb3f-85a458b574cf/volumes" Dec 11 05:41:05 crc kubenswrapper[4628]: I1211 05:41:05.910476 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f" path="/var/lib/kubelet/pods/6ed312d5-c8ce-42d7-90bd-49e3ef4f5b6f/volumes" Dec 11 05:41:05 crc kubenswrapper[4628]: I1211 05:41:05.912767 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a41b81d7-fa6c-4daa-b06d-df1105c0e566" path="/var/lib/kubelet/pods/a41b81d7-fa6c-4daa-b06d-df1105c0e566/volumes" Dec 11 05:41:05 crc kubenswrapper[4628]: I1211 05:41:05.914129 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1ede4d9-f3c1-4e13-948d-abd83adb1397" path="/var/lib/kubelet/pods/f1ede4d9-f3c1-4e13-948d-abd83adb1397/volumes" Dec 11 05:41:07 crc kubenswrapper[4628]: I1211 05:41:07.472312 4628 generic.go:334] "Generic (PLEG): container finished" podID="376d3aeb-b569-4e4e-847a-762ed8f12b35" containerID="68117566df12c9218ec0f7589712831919e57b6a95e9740e4b0ee84dfe002e61" exitCode=0 Dec 11 05:41:07 crc kubenswrapper[4628]: I1211 05:41:07.472356 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" event={"ID":"376d3aeb-b569-4e4e-847a-762ed8f12b35","Type":"ContainerDied","Data":"68117566df12c9218ec0f7589712831919e57b6a95e9740e4b0ee84dfe002e61"} Dec 11 05:41:08 crc kubenswrapper[4628]: I1211 05:41:08.915735 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.017176 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/376d3aeb-b569-4e4e-847a-762ed8f12b35-inventory\") pod \"376d3aeb-b569-4e4e-847a-762ed8f12b35\" (UID: \"376d3aeb-b569-4e4e-847a-762ed8f12b35\") " Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.017212 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/376d3aeb-b569-4e4e-847a-762ed8f12b35-ssh-key\") pod \"376d3aeb-b569-4e4e-847a-762ed8f12b35\" (UID: \"376d3aeb-b569-4e4e-847a-762ed8f12b35\") " Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.017374 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/376d3aeb-b569-4e4e-847a-762ed8f12b35-bootstrap-combined-ca-bundle\") pod \"376d3aeb-b569-4e4e-847a-762ed8f12b35\" (UID: \"376d3aeb-b569-4e4e-847a-762ed8f12b35\") " Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.017419 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjjkr\" (UniqueName: \"kubernetes.io/projected/376d3aeb-b569-4e4e-847a-762ed8f12b35-kube-api-access-fjjkr\") pod \"376d3aeb-b569-4e4e-847a-762ed8f12b35\" (UID: \"376d3aeb-b569-4e4e-847a-762ed8f12b35\") " Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.022351 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/376d3aeb-b569-4e4e-847a-762ed8f12b35-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "376d3aeb-b569-4e4e-847a-762ed8f12b35" (UID: "376d3aeb-b569-4e4e-847a-762ed8f12b35"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.023132 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/376d3aeb-b569-4e4e-847a-762ed8f12b35-kube-api-access-fjjkr" (OuterVolumeSpecName: "kube-api-access-fjjkr") pod "376d3aeb-b569-4e4e-847a-762ed8f12b35" (UID: "376d3aeb-b569-4e4e-847a-762ed8f12b35"). InnerVolumeSpecName "kube-api-access-fjjkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.061014 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/376d3aeb-b569-4e4e-847a-762ed8f12b35-inventory" (OuterVolumeSpecName: "inventory") pod "376d3aeb-b569-4e4e-847a-762ed8f12b35" (UID: "376d3aeb-b569-4e4e-847a-762ed8f12b35"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.061378 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/376d3aeb-b569-4e4e-847a-762ed8f12b35-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "376d3aeb-b569-4e4e-847a-762ed8f12b35" (UID: "376d3aeb-b569-4e4e-847a-762ed8f12b35"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.120550 4628 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/376d3aeb-b569-4e4e-847a-762ed8f12b35-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.120590 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjjkr\" (UniqueName: \"kubernetes.io/projected/376d3aeb-b569-4e4e-847a-762ed8f12b35-kube-api-access-fjjkr\") on node \"crc\" DevicePath \"\"" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.120605 4628 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/376d3aeb-b569-4e4e-847a-762ed8f12b35-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.120616 4628 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/376d3aeb-b569-4e4e-847a-762ed8f12b35-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.490432 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" event={"ID":"376d3aeb-b569-4e4e-847a-762ed8f12b35","Type":"ContainerDied","Data":"727c33eee1c8e6c34f7c10bf7cb29f5ab56703718c3a75ce00ffa38e7677a281"} Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.490468 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="727c33eee1c8e6c34f7c10bf7cb29f5ab56703718c3a75ce00ffa38e7677a281" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.490472 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.587241 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd"] Dec 11 05:41:09 crc kubenswrapper[4628]: E1211 05:41:09.587601 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="376d3aeb-b569-4e4e-847a-762ed8f12b35" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.587618 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="376d3aeb-b569-4e4e-847a-762ed8f12b35" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 11 05:41:09 crc kubenswrapper[4628]: E1211 05:41:09.587628 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8af5a90-5182-4e1c-83bb-f792c85412de" containerName="extract-content" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.587634 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8af5a90-5182-4e1c-83bb-f792c85412de" containerName="extract-content" Dec 11 05:41:09 crc kubenswrapper[4628]: E1211 05:41:09.587642 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d" containerName="registry-server" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.587648 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d" containerName="registry-server" Dec 11 05:41:09 crc kubenswrapper[4628]: E1211 05:41:09.587657 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8af5a90-5182-4e1c-83bb-f792c85412de" containerName="registry-server" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.587664 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8af5a90-5182-4e1c-83bb-f792c85412de" containerName="registry-server" Dec 11 05:41:09 crc kubenswrapper[4628]: E1211 05:41:09.587674 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d" containerName="extract-content" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.587681 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d" containerName="extract-content" Dec 11 05:41:09 crc kubenswrapper[4628]: E1211 05:41:09.587692 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d" containerName="extract-utilities" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.587699 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d" containerName="extract-utilities" Dec 11 05:41:09 crc kubenswrapper[4628]: E1211 05:41:09.587717 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8af5a90-5182-4e1c-83bb-f792c85412de" containerName="extract-utilities" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.587723 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8af5a90-5182-4e1c-83bb-f792c85412de" containerName="extract-utilities" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.587885 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="376d3aeb-b569-4e4e-847a-762ed8f12b35" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.587900 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8af5a90-5182-4e1c-83bb-f792c85412de" 
containerName="registry-server" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.587920 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c4ac6d7-2a29-4b6d-bba1-7d3f6ffbd50d" containerName="registry-server" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.588501 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.590631 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.590666 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t5hzf" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.591029 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.591295 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.597744 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd"] Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.731869 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwqtp\" (UniqueName: \"kubernetes.io/projected/4416beb7-730c-4898-b603-a123279eb238-kube-api-access-lwqtp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd\" (UID: \"4416beb7-730c-4898-b603-a123279eb238\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.731986 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4416beb7-730c-4898-b603-a123279eb238-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd\" (UID: \"4416beb7-730c-4898-b603-a123279eb238\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.732095 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4416beb7-730c-4898-b603-a123279eb238-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd\" (UID: \"4416beb7-730c-4898-b603-a123279eb238\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.833806 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwqtp\" (UniqueName: \"kubernetes.io/projected/4416beb7-730c-4898-b603-a123279eb238-kube-api-access-lwqtp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd\" (UID: \"4416beb7-730c-4898-b603-a123279eb238\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.833917 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4416beb7-730c-4898-b603-a123279eb238-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd\" (UID: \"4416beb7-730c-4898-b603-a123279eb238\") " 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.833976 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4416beb7-730c-4898-b603-a123279eb238-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd\" (UID: \"4416beb7-730c-4898-b603-a123279eb238\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.840912 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4416beb7-730c-4898-b603-a123279eb238-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd\" (UID: \"4416beb7-730c-4898-b603-a123279eb238\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.845365 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4416beb7-730c-4898-b603-a123279eb238-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd\" (UID: \"4416beb7-730c-4898-b603-a123279eb238\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.858934 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwqtp\" (UniqueName: \"kubernetes.io/projected/4416beb7-730c-4898-b603-a123279eb238-kube-api-access-lwqtp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd\" (UID: \"4416beb7-730c-4898-b603-a123279eb238\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd" Dec 11 05:41:09 crc kubenswrapper[4628]: I1211 05:41:09.907093 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd" Dec 11 05:41:10 crc kubenswrapper[4628]: I1211 05:41:10.451239 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd"] Dec 11 05:41:10 crc kubenswrapper[4628]: I1211 05:41:10.501124 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd" event={"ID":"4416beb7-730c-4898-b603-a123279eb238","Type":"ContainerStarted","Data":"31473863a4525bd6a301836e392baab929d2d66eeed3867caf0c056a1540db41"} Dec 11 05:41:11 crc kubenswrapper[4628]: I1211 05:41:11.518606 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd" event={"ID":"4416beb7-730c-4898-b603-a123279eb238","Type":"ContainerStarted","Data":"45a76a9142263179449532e7e1f886ee5e04324db37ac9e9c3bd2f7e95faaa08"} Dec 11 05:41:11 crc kubenswrapper[4628]: I1211 05:41:11.539654 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd" podStartSLOduration=2.053424996 podStartE2EDuration="2.539628497s" podCreationTimestamp="2025-12-11 05:41:09 +0000 UTC" firstStartedPulling="2025-12-11 05:41:10.464863779 +0000 UTC m=+1572.882210477" lastFinishedPulling="2025-12-11 05:41:10.95106724 +0000 UTC m=+1573.368413978" observedRunningTime="2025-12-11 05:41:11.53898233 +0000 UTC m=+1573.956329028" watchObservedRunningTime="2025-12-11 05:41:11.539628497 +0000 UTC m=+1573.956975195" Dec 11 05:41:12 crc kubenswrapper[4628]: I1211 05:41:12.582405 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m2gb8"] Dec 11 05:41:12 crc kubenswrapper[4628]: I1211 05:41:12.585417 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m2gb8" Dec 11 05:41:12 crc kubenswrapper[4628]: I1211 05:41:12.607642 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m2gb8"] Dec 11 05:41:12 crc kubenswrapper[4628]: I1211 05:41:12.691526 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5g95\" (UniqueName: \"kubernetes.io/projected/05ec8848-e239-4c28-b358-18e5c50a9fd5-kube-api-access-l5g95\") pod \"redhat-operators-m2gb8\" (UID: \"05ec8848-e239-4c28-b358-18e5c50a9fd5\") " pod="openshift-marketplace/redhat-operators-m2gb8" Dec 11 05:41:12 crc kubenswrapper[4628]: I1211 05:41:12.692276 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05ec8848-e239-4c28-b358-18e5c50a9fd5-catalog-content\") pod \"redhat-operators-m2gb8\" (UID: \"05ec8848-e239-4c28-b358-18e5c50a9fd5\") " pod="openshift-marketplace/redhat-operators-m2gb8" Dec 11 05:41:12 crc kubenswrapper[4628]: I1211 05:41:12.692478 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05ec8848-e239-4c28-b358-18e5c50a9fd5-utilities\") pod \"redhat-operators-m2gb8\" (UID: \"05ec8848-e239-4c28-b358-18e5c50a9fd5\") " pod="openshift-marketplace/redhat-operators-m2gb8" Dec 11 05:41:12 crc kubenswrapper[4628]: I1211 05:41:12.794397 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5g95\" (UniqueName: \"kubernetes.io/projected/05ec8848-e239-4c28-b358-18e5c50a9fd5-kube-api-access-l5g95\") pod \"redhat-operators-m2gb8\" (UID: \"05ec8848-e239-4c28-b358-18e5c50a9fd5\") " pod="openshift-marketplace/redhat-operators-m2gb8" Dec 11 05:41:12 crc kubenswrapper[4628]: I1211 05:41:12.794639 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05ec8848-e239-4c28-b358-18e5c50a9fd5-catalog-content\") pod \"redhat-operators-m2gb8\" (UID: \"05ec8848-e239-4c28-b358-18e5c50a9fd5\") " pod="openshift-marketplace/redhat-operators-m2gb8" Dec 11 05:41:12 crc kubenswrapper[4628]: I1211 05:41:12.794732 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05ec8848-e239-4c28-b358-18e5c50a9fd5-utilities\") pod \"redhat-operators-m2gb8\" (UID: \"05ec8848-e239-4c28-b358-18e5c50a9fd5\") " pod="openshift-marketplace/redhat-operators-m2gb8" Dec 11 05:41:12 crc kubenswrapper[4628]: I1211 05:41:12.795186 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05ec8848-e239-4c28-b358-18e5c50a9fd5-utilities\") pod \"redhat-operators-m2gb8\" (UID: \"05ec8848-e239-4c28-b358-18e5c50a9fd5\") " pod="openshift-marketplace/redhat-operators-m2gb8" Dec 11 05:41:12 crc kubenswrapper[4628]: I1211 05:41:12.795713 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05ec8848-e239-4c28-b358-18e5c50a9fd5-catalog-content\") pod \"redhat-operators-m2gb8\" (UID: \"05ec8848-e239-4c28-b358-18e5c50a9fd5\") " pod="openshift-marketplace/redhat-operators-m2gb8" Dec 11 05:41:12 crc kubenswrapper[4628]: I1211 05:41:12.823310 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-l5g95\" (UniqueName: \"kubernetes.io/projected/05ec8848-e239-4c28-b358-18e5c50a9fd5-kube-api-access-l5g95\") pod \"redhat-operators-m2gb8\" (UID: \"05ec8848-e239-4c28-b358-18e5c50a9fd5\") " pod="openshift-marketplace/redhat-operators-m2gb8" Dec 11 05:41:12 crc kubenswrapper[4628]: I1211 05:41:12.910351 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m2gb8" Dec 11 05:41:13 crc kubenswrapper[4628]: I1211 05:41:13.464049 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m2gb8"] Dec 11 05:41:13 crc kubenswrapper[4628]: I1211 05:41:13.541812 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2gb8" event={"ID":"05ec8848-e239-4c28-b358-18e5c50a9fd5","Type":"ContainerStarted","Data":"a6d5a31907bbefc74751592de8fa3ca075341a691311a3c6b375cd4d9a007ff6"} Dec 11 05:41:14 crc kubenswrapper[4628]: I1211 05:41:14.559405 4628 generic.go:334] "Generic (PLEG): container finished" podID="05ec8848-e239-4c28-b358-18e5c50a9fd5" containerID="fceca00362284b2406497c5da44466fb6c0649bf0b04b726142aca8b75130672" exitCode=0 Dec 11 05:41:14 crc kubenswrapper[4628]: I1211 05:41:14.559729 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2gb8" event={"ID":"05ec8848-e239-4c28-b358-18e5c50a9fd5","Type":"ContainerDied","Data":"fceca00362284b2406497c5da44466fb6c0649bf0b04b726142aca8b75130672"} Dec 11 05:41:15 crc kubenswrapper[4628]: I1211 05:41:15.571070 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2gb8" event={"ID":"05ec8848-e239-4c28-b358-18e5c50a9fd5","Type":"ContainerStarted","Data":"4cec5dfd34f1f33b12af53ebab01b8b11933e5151a152564aa49a48b2b3aea74"} Dec 11 05:41:15 crc kubenswrapper[4628]: I1211 05:41:15.892867 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:41:15 crc kubenswrapper[4628]: E1211 05:41:15.893113 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:41:20 crc kubenswrapper[4628]: I1211 05:41:20.623406 4628 generic.go:334] "Generic (PLEG): container finished" podID="05ec8848-e239-4c28-b358-18e5c50a9fd5" containerID="4cec5dfd34f1f33b12af53ebab01b8b11933e5151a152564aa49a48b2b3aea74" exitCode=0 Dec 11 05:41:20 crc kubenswrapper[4628]: I1211 05:41:20.623462 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2gb8" event={"ID":"05ec8848-e239-4c28-b358-18e5c50a9fd5","Type":"ContainerDied","Data":"4cec5dfd34f1f33b12af53ebab01b8b11933e5151a152564aa49a48b2b3aea74"} Dec 11 05:41:21 crc kubenswrapper[4628]: I1211 05:41:21.640348 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2gb8" event={"ID":"05ec8848-e239-4c28-b358-18e5c50a9fd5","Type":"ContainerStarted","Data":"9b3a6316ed4a095d73c729930bf72dd497229ea0321d481eefdff82104db2b53"} Dec 11 05:41:21 crc kubenswrapper[4628]: I1211 05:41:21.675183 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-m2gb8" podStartSLOduration=3.012015513 podStartE2EDuration="9.675156485s" podCreationTimestamp="2025-12-11 05:41:12 +0000 UTC" firstStartedPulling="2025-12-11 05:41:14.564406746 +0000 UTC m=+1576.981753444" lastFinishedPulling="2025-12-11 05:41:21.227547708 +0000 UTC m=+1583.644894416" observedRunningTime="2025-12-11 05:41:21.671149736 +0000 UTC m=+1584.088496434" watchObservedRunningTime="2025-12-11 05:41:21.675156485 +0000 UTC m=+1584.092503183" Dec 11 05:41:22 crc kubenswrapper[4628]: I1211 05:41:22.911359 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m2gb8" Dec 11 05:41:22 crc kubenswrapper[4628]: I1211 05:41:22.911431 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m2gb8" Dec 11 05:41:23 crc kubenswrapper[4628]: I1211 05:41:23.966071 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m2gb8" podUID="05ec8848-e239-4c28-b358-18e5c50a9fd5" containerName="registry-server" probeResult="failure" output=< Dec 11 05:41:23 crc kubenswrapper[4628]: timeout: failed to connect service ":50051" within 1s Dec 11 05:41:23 crc kubenswrapper[4628]: > Dec 11 05:41:26 crc kubenswrapper[4628]: I1211 05:41:26.070574 4628 scope.go:117] "RemoveContainer" containerID="d655cb6ef20d42acc69695f35f24e60440857fa52c04308eed046368968a76ce" Dec 11 05:41:26 crc kubenswrapper[4628]: I1211 05:41:26.112246 4628 scope.go:117] "RemoveContainer" containerID="2ef618827cdb881bfe76bf1c683e0699f7a0cb3b6f729c0565f73ab04b9fb4b7" Dec 11 05:41:26 crc kubenswrapper[4628]: I1211 05:41:26.173407 4628 scope.go:117] "RemoveContainer" containerID="520f93202cb4c3be86306ac0ef9a3f0ec65b385db5c541efe42cc93aa795bde3" Dec 11 05:41:26 crc kubenswrapper[4628]: I1211 05:41:26.214709 4628 scope.go:117] "RemoveContainer" containerID="64b1947fedcd6a1a6e6e81a6e66951f505d75885d5b1ebc254bdae52874769b0" Dec 11 05:41:26 crc kubenswrapper[4628]: I1211 05:41:26.237306 4628 scope.go:117] "RemoveContainer" containerID="dba35f191d2ecfc5135c3769ee1a25f42bbe3da0164752941bf2604ba1af84c7" Dec 11 05:41:26 crc kubenswrapper[4628]: I1211 05:41:26.274765 4628 scope.go:117] "RemoveContainer" containerID="3836ed00ef5c73025e39977eb2120f3f5cf6f09ccc0bc2fa86b07297ea274b62" Dec 11 05:41:26 crc kubenswrapper[4628]: I1211 05:41:26.319369 4628 scope.go:117] "RemoveContainer" containerID="d3dabf5ed46d314ca946e16acce857c86397b012fa65749804022453003eced9" Dec 11 05:41:28 crc kubenswrapper[4628]: I1211 05:41:28.889622 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:41:28 crc kubenswrapper[4628]: E1211 05:41:28.890151 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:41:32 crc kubenswrapper[4628]: I1211 05:41:32.973004 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m2gb8" Dec 11 05:41:33 crc kubenswrapper[4628]: I1211 05:41:33.025035 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-m2gb8" Dec 11 05:41:33 crc kubenswrapper[4628]: I1211 05:41:33.214094 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m2gb8"] Dec 11 05:41:34 crc kubenswrapper[4628]: I1211 05:41:34.049664 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-2pxdl"] Dec 11 05:41:34 crc kubenswrapper[4628]: I1211 05:41:34.058638 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-f632-account-create-update-cvcfq"] Dec 11 05:41:34 crc kubenswrapper[4628]: I1211 05:41:34.066359 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-cl2fq"] Dec 11 05:41:34 crc kubenswrapper[4628]: I1211 05:41:34.079390 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-2pxdl"] Dec 11 05:41:34 crc kubenswrapper[4628]: I1211 05:41:34.087303 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-f632-account-create-update-cvcfq"] Dec 11 05:41:34 crc kubenswrapper[4628]: I1211 05:41:34.094391 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-cl2fq"] Dec 11 05:41:34 crc kubenswrapper[4628]: I1211 05:41:34.762350 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m2gb8" podUID="05ec8848-e239-4c28-b358-18e5c50a9fd5" containerName="registry-server" containerID="cri-o://9b3a6316ed4a095d73c729930bf72dd497229ea0321d481eefdff82104db2b53" gracePeriod=2 Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.035900 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-xhlf8"] Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.055948 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-xhlf8"] Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.073035 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-01ba-account-create-update-wzhwm"] Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.082941 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-01ba-account-create-update-wzhwm"] Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.094163 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-bfc0-account-create-update-6lpng"] Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.109739 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-bfc0-account-create-update-6lpng"] Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.270614 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m2gb8" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.340485 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05ec8848-e239-4c28-b358-18e5c50a9fd5-utilities\") pod \"05ec8848-e239-4c28-b358-18e5c50a9fd5\" (UID: \"05ec8848-e239-4c28-b358-18e5c50a9fd5\") " Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.341108 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05ec8848-e239-4c28-b358-18e5c50a9fd5-catalog-content\") pod \"05ec8848-e239-4c28-b358-18e5c50a9fd5\" (UID: \"05ec8848-e239-4c28-b358-18e5c50a9fd5\") " Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.341368 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5g95\" (UniqueName: \"kubernetes.io/projected/05ec8848-e239-4c28-b358-18e5c50a9fd5-kube-api-access-l5g95\") pod \"05ec8848-e239-4c28-b358-18e5c50a9fd5\" (UID: \"05ec8848-e239-4c28-b358-18e5c50a9fd5\") " Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.341549 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05ec8848-e239-4c28-b358-18e5c50a9fd5-utilities" (OuterVolumeSpecName: "utilities") pod "05ec8848-e239-4c28-b358-18e5c50a9fd5" (UID: "05ec8848-e239-4c28-b358-18e5c50a9fd5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.342683 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05ec8848-e239-4c28-b358-18e5c50a9fd5-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.347559 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05ec8848-e239-4c28-b358-18e5c50a9fd5-kube-api-access-l5g95" (OuterVolumeSpecName: "kube-api-access-l5g95") pod "05ec8848-e239-4c28-b358-18e5c50a9fd5" (UID: "05ec8848-e239-4c28-b358-18e5c50a9fd5"). InnerVolumeSpecName "kube-api-access-l5g95". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.445360 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5g95\" (UniqueName: \"kubernetes.io/projected/05ec8848-e239-4c28-b358-18e5c50a9fd5-kube-api-access-l5g95\") on node \"crc\" DevicePath \"\"" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.476539 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05ec8848-e239-4c28-b358-18e5c50a9fd5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05ec8848-e239-4c28-b358-18e5c50a9fd5" (UID: "05ec8848-e239-4c28-b358-18e5c50a9fd5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.546744 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05ec8848-e239-4c28-b358-18e5c50a9fd5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.777361 4628 generic.go:334] "Generic (PLEG): container finished" podID="05ec8848-e239-4c28-b358-18e5c50a9fd5" containerID="9b3a6316ed4a095d73c729930bf72dd497229ea0321d481eefdff82104db2b53" exitCode=0 Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.777413 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2gb8" event={"ID":"05ec8848-e239-4c28-b358-18e5c50a9fd5","Type":"ContainerDied","Data":"9b3a6316ed4a095d73c729930bf72dd497229ea0321d481eefdff82104db2b53"} Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.777465 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m2gb8" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.777499 4628 scope.go:117] "RemoveContainer" containerID="9b3a6316ed4a095d73c729930bf72dd497229ea0321d481eefdff82104db2b53" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.777484 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m2gb8" event={"ID":"05ec8848-e239-4c28-b358-18e5c50a9fd5","Type":"ContainerDied","Data":"a6d5a31907bbefc74751592de8fa3ca075341a691311a3c6b375cd4d9a007ff6"} Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.823395 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m2gb8"] Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.827795 4628 scope.go:117] "RemoveContainer" containerID="4cec5dfd34f1f33b12af53ebab01b8b11933e5151a152564aa49a48b2b3aea74" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.834613 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m2gb8"] Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.847051 4628 scope.go:117] "RemoveContainer" containerID="fceca00362284b2406497c5da44466fb6c0649bf0b04b726142aca8b75130672" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.901302 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05ec8848-e239-4c28-b358-18e5c50a9fd5" path="/var/lib/kubelet/pods/05ec8848-e239-4c28-b358-18e5c50a9fd5/volumes" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.902208 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11763665-9dfb-4894-94f3-1729ee24848a" path="/var/lib/kubelet/pods/11763665-9dfb-4894-94f3-1729ee24848a/volumes" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.903310 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1480b26d-86ec-4157-ae9d-d3333ccc2932" path="/var/lib/kubelet/pods/1480b26d-86ec-4157-ae9d-d3333ccc2932/volumes" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.904654 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b35a721-4483-4f15-a0a4-b516b96f9c76" path="/var/lib/kubelet/pods/2b35a721-4483-4f15-a0a4-b516b96f9c76/volumes" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.905759 4628 scope.go:117] "RemoveContainer" containerID="9b3a6316ed4a095d73c729930bf72dd497229ea0321d481eefdff82104db2b53" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.906032 4628 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2df1b74b-f6be-41c4-b9f1-6916553ef1d9" path="/var/lib/kubelet/pods/2df1b74b-f6be-41c4-b9f1-6916553ef1d9/volumes" Dec 11 05:41:35 crc kubenswrapper[4628]: E1211 05:41:35.906977 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b3a6316ed4a095d73c729930bf72dd497229ea0321d481eefdff82104db2b53\": container with ID starting with 9b3a6316ed4a095d73c729930bf72dd497229ea0321d481eefdff82104db2b53 not found: ID does not exist" containerID="9b3a6316ed4a095d73c729930bf72dd497229ea0321d481eefdff82104db2b53" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.907021 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3a6316ed4a095d73c729930bf72dd497229ea0321d481eefdff82104db2b53"} err="failed to get container status \"9b3a6316ed4a095d73c729930bf72dd497229ea0321d481eefdff82104db2b53\": rpc error: code = NotFound desc = could not find container \"9b3a6316ed4a095d73c729930bf72dd497229ea0321d481eefdff82104db2b53\": container with ID starting with 9b3a6316ed4a095d73c729930bf72dd497229ea0321d481eefdff82104db2b53 not found: ID does not exist" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.907048 4628 scope.go:117] "RemoveContainer" containerID="4cec5dfd34f1f33b12af53ebab01b8b11933e5151a152564aa49a48b2b3aea74" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.907240 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5167f66-289b-4976-b502-640a327fa7bc" path="/var/lib/kubelet/pods/b5167f66-289b-4976-b502-640a327fa7bc/volumes" Dec 11 05:41:35 crc kubenswrapper[4628]: E1211 05:41:35.907609 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cec5dfd34f1f33b12af53ebab01b8b11933e5151a152564aa49a48b2b3aea74\": container with ID starting with 4cec5dfd34f1f33b12af53ebab01b8b11933e5151a152564aa49a48b2b3aea74 not found: ID does not exist" containerID="4cec5dfd34f1f33b12af53ebab01b8b11933e5151a152564aa49a48b2b3aea74" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.907702 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cec5dfd34f1f33b12af53ebab01b8b11933e5151a152564aa49a48b2b3aea74"} err="failed to get container status \"4cec5dfd34f1f33b12af53ebab01b8b11933e5151a152564aa49a48b2b3aea74\": rpc error: code = NotFound desc = could not find container \"4cec5dfd34f1f33b12af53ebab01b8b11933e5151a152564aa49a48b2b3aea74\": container with ID starting with 4cec5dfd34f1f33b12af53ebab01b8b11933e5151a152564aa49a48b2b3aea74 not found: ID does not exist" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.907770 4628 scope.go:117] "RemoveContainer" containerID="fceca00362284b2406497c5da44466fb6c0649bf0b04b726142aca8b75130672" Dec 11 05:41:35 crc kubenswrapper[4628]: E1211 05:41:35.908341 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fceca00362284b2406497c5da44466fb6c0649bf0b04b726142aca8b75130672\": container with ID starting with fceca00362284b2406497c5da44466fb6c0649bf0b04b726142aca8b75130672 not found: ID does not exist" containerID="fceca00362284b2406497c5da44466fb6c0649bf0b04b726142aca8b75130672" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.908473 4628 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fceca00362284b2406497c5da44466fb6c0649bf0b04b726142aca8b75130672"} err="failed to get container status \"fceca00362284b2406497c5da44466fb6c0649bf0b04b726142aca8b75130672\": rpc error: code = NotFound desc = could not find container \"fceca00362284b2406497c5da44466fb6c0649bf0b04b726142aca8b75130672\": container with ID starting with fceca00362284b2406497c5da44466fb6c0649bf0b04b726142aca8b75130672 not found: ID does not exist" Dec 11 05:41:35 crc kubenswrapper[4628]: I1211 05:41:35.909080 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c13e137b-1a8a-4965-8f85-04ad2b5ff488" path="/var/lib/kubelet/pods/c13e137b-1a8a-4965-8f85-04ad2b5ff488/volumes" Dec 11 05:41:41 crc kubenswrapper[4628]: I1211 05:41:41.889289 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:41:41 crc kubenswrapper[4628]: E1211 05:41:41.890093 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:41:44 crc kubenswrapper[4628]: I1211 05:41:44.049538 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-p46mc"] Dec 11 05:41:44 crc kubenswrapper[4628]: I1211 05:41:44.063725 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-p46mc"] Dec 11 05:41:45 crc kubenswrapper[4628]: I1211 05:41:45.901637 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dcb4aef-66a4-452a-a29b-5d387373785e" path="/var/lib/kubelet/pods/4dcb4aef-66a4-452a-a29b-5d387373785e/volumes" Dec 11 05:41:52 crc kubenswrapper[4628]: I1211 05:41:52.892782 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:41:52 crc kubenswrapper[4628]: E1211 05:41:52.894056 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:42:01 crc kubenswrapper[4628]: I1211 05:42:01.050807 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-ksb9p"] Dec 11 05:42:01 crc kubenswrapper[4628]: I1211 05:42:01.065280 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-ksb9p"] Dec 11 05:42:01 crc kubenswrapper[4628]: I1211 05:42:01.904949 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98c407a5-95e8-4036-becd-3522286435d5" path="/var/lib/kubelet/pods/98c407a5-95e8-4036-becd-3522286435d5/volumes" Dec 11 05:42:06 crc kubenswrapper[4628]: I1211 05:42:06.890683 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:42:06 crc kubenswrapper[4628]: E1211 05:42:06.892083 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:42:21 crc kubenswrapper[4628]: I1211 05:42:21.077242 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-zvwzz"] Dec 11 05:42:21 crc kubenswrapper[4628]: I1211 05:42:21.099361 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-zvwzz"] Dec 11 05:42:21 crc kubenswrapper[4628]: I1211 05:42:21.890967 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:42:21 crc kubenswrapper[4628]: E1211 05:42:21.891195 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:42:21 crc kubenswrapper[4628]: I1211 05:42:21.919900 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c53cf2b-ce22-43f3-88fa-4a91ea4131bc" path="/var/lib/kubelet/pods/8c53cf2b-ce22-43f3-88fa-4a91ea4131bc/volumes" Dec 11 05:42:26 crc kubenswrapper[4628]: I1211 05:42:26.526374 4628 scope.go:117] "RemoveContainer" containerID="76d0959098e92c7f8c31c62866c6a21ad529ac4fe2ec60f4e70927e4ca5f3805" Dec 11 05:42:26 crc kubenswrapper[4628]: I1211 05:42:26.552072 4628 scope.go:117] "RemoveContainer" containerID="f81e3b712c601ac3815979e6f05e26f4b5a91a2be5ec45cc68324b2d709acbbb" Dec 11 05:42:26 crc kubenswrapper[4628]: I1211 05:42:26.599217 4628 scope.go:117] "RemoveContainer" containerID="fdce9474a37802f20d7909fc13787fe95fe63b8a57ba9c19707a22b5c49b3a81" Dec 11 05:42:26 crc kubenswrapper[4628]: I1211 05:42:26.658323 4628 scope.go:117] "RemoveContainer" containerID="b2ff9d482808e0260af9e769ffb830176737c5e86d0a0cf19b631f2b64bdebb1" Dec 11 05:42:26 crc kubenswrapper[4628]: I1211 05:42:26.697171 4628 scope.go:117] "RemoveContainer" containerID="f88d95bf5233cd751b800a6bd4f135a25aa9dfd6ebd3dba962ce4cb44b5f561c" Dec 11 05:42:26 crc kubenswrapper[4628]: I1211 05:42:26.737525 4628 scope.go:117] "RemoveContainer" containerID="f5cb61ff69c4cc2f3f37f09a5f1c204ec02775c42134b14406f6a1af785698b2" Dec 11 05:42:26 crc kubenswrapper[4628]: I1211 05:42:26.785505 4628 scope.go:117] "RemoveContainer" containerID="5d9e0857c1105bf0db5e0cf3f0e2d1741fde853e1a390d3fc1294fb4422b07df" Dec 11 05:42:26 crc kubenswrapper[4628]: I1211 05:42:26.813030 4628 scope.go:117] "RemoveContainer" containerID="88689b8c1e3f9b317f9ab4b507b04d82e9452615ae03345e83d14d2ce1b1ad2e" Dec 11 05:42:26 crc kubenswrapper[4628]: I1211 05:42:26.837116 4628 scope.go:117] "RemoveContainer" containerID="108dc06401e668ce1649c20dbf21c7fc5873e3119ee79d0dab68e6588f7f0bf3" Dec 11 05:42:32 crc kubenswrapper[4628]: I1211 05:42:32.891386 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:42:32 crc kubenswrapper[4628]: E1211 05:42:32.892364 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:42:39 crc kubenswrapper[4628]: I1211 05:42:39.042143 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-nd79k"] Dec 11 05:42:39 crc kubenswrapper[4628]: I1211 05:42:39.052097 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-nd79k"] Dec 11 05:42:39 crc kubenswrapper[4628]: I1211 05:42:39.903801 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5" path="/var/lib/kubelet/pods/0f9b5e76-b5f7-4cfd-9a2a-748fef5a02b5/volumes" Dec 11 05:42:43 crc kubenswrapper[4628]: I1211 05:42:43.056423 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-79fmw"] Dec 11 05:42:43 crc kubenswrapper[4628]: I1211 05:42:43.064945 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-8b969"] Dec 11 05:42:43 crc kubenswrapper[4628]: I1211 05:42:43.075695 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-8b969"] Dec 11 05:42:43 crc kubenswrapper[4628]: I1211 05:42:43.083907 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-79fmw"] Dec 11 05:42:43 crc kubenswrapper[4628]: I1211 05:42:43.890181 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:42:43 crc kubenswrapper[4628]: E1211 05:42:43.890643 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:42:43 crc kubenswrapper[4628]: I1211 05:42:43.906314 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90c5df18-e257-4561-8148-8cebd4644e40" path="/var/lib/kubelet/pods/90c5df18-e257-4561-8148-8cebd4644e40/volumes" Dec 11 05:42:43 crc kubenswrapper[4628]: I1211 05:42:43.907783 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c" path="/var/lib/kubelet/pods/c2e2b798-8ec0-4378-8ff7-6689e7ed8e2c/volumes" Dec 11 05:42:58 crc kubenswrapper[4628]: I1211 05:42:58.890717 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:42:58 crc kubenswrapper[4628]: E1211 05:42:58.891408 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:43:03 crc kubenswrapper[4628]: I1211 05:43:03.035375 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-n2b6t"] Dec 11 05:43:03 crc kubenswrapper[4628]: I1211 05:43:03.043361 4628 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-n2b6t"] Dec 11 05:43:03 crc kubenswrapper[4628]: I1211 05:43:03.904669 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38627c48-4a86-4721-874d-8f386ea24495" path="/var/lib/kubelet/pods/38627c48-4a86-4721-874d-8f386ea24495/volumes" Dec 11 05:43:09 crc kubenswrapper[4628]: I1211 05:43:09.890059 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:43:09 crc kubenswrapper[4628]: E1211 05:43:09.892992 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:43:13 crc kubenswrapper[4628]: I1211 05:43:13.175630 4628 generic.go:334] "Generic (PLEG): container finished" podID="4416beb7-730c-4898-b603-a123279eb238" containerID="45a76a9142263179449532e7e1f886ee5e04324db37ac9e9c3bd2f7e95faaa08" exitCode=0 Dec 11 05:43:13 crc kubenswrapper[4628]: I1211 05:43:13.175688 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd" event={"ID":"4416beb7-730c-4898-b603-a123279eb238","Type":"ContainerDied","Data":"45a76a9142263179449532e7e1f886ee5e04324db37ac9e9c3bd2f7e95faaa08"} Dec 11 05:43:14 crc kubenswrapper[4628]: I1211 05:43:14.635076 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd" Dec 11 05:43:14 crc kubenswrapper[4628]: I1211 05:43:14.729101 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4416beb7-730c-4898-b603-a123279eb238-ssh-key\") pod \"4416beb7-730c-4898-b603-a123279eb238\" (UID: \"4416beb7-730c-4898-b603-a123279eb238\") " Dec 11 05:43:14 crc kubenswrapper[4628]: I1211 05:43:14.729291 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwqtp\" (UniqueName: \"kubernetes.io/projected/4416beb7-730c-4898-b603-a123279eb238-kube-api-access-lwqtp\") pod \"4416beb7-730c-4898-b603-a123279eb238\" (UID: \"4416beb7-730c-4898-b603-a123279eb238\") " Dec 11 05:43:14 crc kubenswrapper[4628]: I1211 05:43:14.729436 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4416beb7-730c-4898-b603-a123279eb238-inventory\") pod \"4416beb7-730c-4898-b603-a123279eb238\" (UID: \"4416beb7-730c-4898-b603-a123279eb238\") " Dec 11 05:43:14 crc kubenswrapper[4628]: I1211 05:43:14.739879 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4416beb7-730c-4898-b603-a123279eb238-kube-api-access-lwqtp" (OuterVolumeSpecName: "kube-api-access-lwqtp") pod "4416beb7-730c-4898-b603-a123279eb238" (UID: "4416beb7-730c-4898-b603-a123279eb238"). InnerVolumeSpecName "kube-api-access-lwqtp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:43:14 crc kubenswrapper[4628]: I1211 05:43:14.757747 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4416beb7-730c-4898-b603-a123279eb238-inventory" (OuterVolumeSpecName: "inventory") pod "4416beb7-730c-4898-b603-a123279eb238" (UID: "4416beb7-730c-4898-b603-a123279eb238"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:43:14 crc kubenswrapper[4628]: I1211 05:43:14.766166 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4416beb7-730c-4898-b603-a123279eb238-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4416beb7-730c-4898-b603-a123279eb238" (UID: "4416beb7-730c-4898-b603-a123279eb238"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:43:14 crc kubenswrapper[4628]: I1211 05:43:14.832323 4628 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4416beb7-730c-4898-b603-a123279eb238-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:43:14 crc kubenswrapper[4628]: I1211 05:43:14.832376 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwqtp\" (UniqueName: \"kubernetes.io/projected/4416beb7-730c-4898-b603-a123279eb238-kube-api-access-lwqtp\") on node \"crc\" DevicePath \"\"" Dec 11 05:43:14 crc kubenswrapper[4628]: I1211 05:43:14.832398 4628 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4416beb7-730c-4898-b603-a123279eb238-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.195997 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd" event={"ID":"4416beb7-730c-4898-b603-a123279eb238","Type":"ContainerDied","Data":"31473863a4525bd6a301836e392baab929d2d66eeed3867caf0c056a1540db41"} Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.196042 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31473863a4525bd6a301836e392baab929d2d66eeed3867caf0c056a1540db41" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.196071 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.396429 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg"] Dec 11 05:43:15 crc kubenswrapper[4628]: E1211 05:43:15.397070 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05ec8848-e239-4c28-b358-18e5c50a9fd5" containerName="registry-server" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.397096 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="05ec8848-e239-4c28-b358-18e5c50a9fd5" containerName="registry-server" Dec 11 05:43:15 crc kubenswrapper[4628]: E1211 05:43:15.397117 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05ec8848-e239-4c28-b358-18e5c50a9fd5" containerName="extract-utilities" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.397126 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="05ec8848-e239-4c28-b358-18e5c50a9fd5" containerName="extract-utilities" Dec 11 05:43:15 crc kubenswrapper[4628]: E1211 05:43:15.397150 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4416beb7-730c-4898-b603-a123279eb238" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.397159 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="4416beb7-730c-4898-b603-a123279eb238" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 11 05:43:15 crc kubenswrapper[4628]: E1211 05:43:15.397178 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05ec8848-e239-4c28-b358-18e5c50a9fd5" containerName="extract-content" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.397186 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="05ec8848-e239-4c28-b358-18e5c50a9fd5" containerName="extract-content" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.397454 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="05ec8848-e239-4c28-b358-18e5c50a9fd5" containerName="registry-server" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.397495 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="4416beb7-730c-4898-b603-a123279eb238" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.398578 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.409707 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t5hzf" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.409981 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.412566 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.414498 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.426043 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg"] Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.545734 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74ebd783-bcc7-4521-a9f2-450201f04c18-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg\" (UID: \"74ebd783-bcc7-4521-a9f2-450201f04c18\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.545828 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74ebd783-bcc7-4521-a9f2-450201f04c18-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg\" (UID: \"74ebd783-bcc7-4521-a9f2-450201f04c18\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.545915 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brz5v\" (UniqueName: \"kubernetes.io/projected/74ebd783-bcc7-4521-a9f2-450201f04c18-kube-api-access-brz5v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg\" (UID: \"74ebd783-bcc7-4521-a9f2-450201f04c18\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.647957 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brz5v\" (UniqueName: \"kubernetes.io/projected/74ebd783-bcc7-4521-a9f2-450201f04c18-kube-api-access-brz5v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg\" (UID: \"74ebd783-bcc7-4521-a9f2-450201f04c18\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.648366 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74ebd783-bcc7-4521-a9f2-450201f04c18-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg\" (UID: \"74ebd783-bcc7-4521-a9f2-450201f04c18\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.648902 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74ebd783-bcc7-4521-a9f2-450201f04c18-inventory\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg\" (UID: \"74ebd783-bcc7-4521-a9f2-450201f04c18\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.653511 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74ebd783-bcc7-4521-a9f2-450201f04c18-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg\" (UID: \"74ebd783-bcc7-4521-a9f2-450201f04c18\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.661532 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74ebd783-bcc7-4521-a9f2-450201f04c18-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg\" (UID: \"74ebd783-bcc7-4521-a9f2-450201f04c18\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.678466 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brz5v\" (UniqueName: \"kubernetes.io/projected/74ebd783-bcc7-4521-a9f2-450201f04c18-kube-api-access-brz5v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg\" (UID: \"74ebd783-bcc7-4521-a9f2-450201f04c18\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg" Dec 11 05:43:15 crc kubenswrapper[4628]: I1211 05:43:15.754591 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg" Dec 11 05:43:16 crc kubenswrapper[4628]: I1211 05:43:16.359427 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg"] Dec 11 05:43:17 crc kubenswrapper[4628]: I1211 05:43:17.214283 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg" event={"ID":"74ebd783-bcc7-4521-a9f2-450201f04c18","Type":"ContainerStarted","Data":"e299bd8d34af2ae14bbab03ffbd6852cc4244477dfde76a369e388e7304130c5"} Dec 11 05:43:17 crc kubenswrapper[4628]: I1211 05:43:17.214745 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg" event={"ID":"74ebd783-bcc7-4521-a9f2-450201f04c18","Type":"ContainerStarted","Data":"0c50be68470053cfa76ce45fc0d3c9cfb7af55dbca61ccadcafac099e756faa5"} Dec 11 05:43:17 crc kubenswrapper[4628]: I1211 05:43:17.237157 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg" podStartSLOduration=1.7369519169999998 podStartE2EDuration="2.237132778s" podCreationTimestamp="2025-12-11 05:43:15 +0000 UTC" firstStartedPulling="2025-12-11 05:43:16.374731065 +0000 UTC m=+1698.792077763" lastFinishedPulling="2025-12-11 05:43:16.874911886 +0000 UTC m=+1699.292258624" observedRunningTime="2025-12-11 05:43:17.230763345 +0000 UTC m=+1699.648110053" watchObservedRunningTime="2025-12-11 05:43:17.237132778 +0000 UTC m=+1699.654479476" Dec 11 05:43:20 crc kubenswrapper[4628]: I1211 05:43:20.889636 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:43:20 crc kubenswrapper[4628]: E1211 05:43:20.891285 4628 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:43:27 crc kubenswrapper[4628]: I1211 05:43:27.035940 4628 scope.go:117] "RemoveContainer" containerID="9129f9730171d9f262b1e72ab8699858bf1860bac87e69c40b0ca3c7700d1323" Dec 11 05:43:27 crc kubenswrapper[4628]: I1211 05:43:27.069294 4628 scope.go:117] "RemoveContainer" containerID="2f25c4c977c6a6e912a4fa4df7d2d12e1e16464d0b916865215eb5faffa036fe" Dec 11 05:43:27 crc kubenswrapper[4628]: I1211 05:43:27.153106 4628 scope.go:117] "RemoveContainer" containerID="f36432b45933f8a8ac33dbaa563e189156370f2eaac211d62b706a9225ef3346" Dec 11 05:43:27 crc kubenswrapper[4628]: I1211 05:43:27.189089 4628 scope.go:117] "RemoveContainer" containerID="4d9ea5e68dc77d1e29479862cc468b74fd29f30f0baa8471daf7986a7f91870f" Dec 11 05:43:31 crc kubenswrapper[4628]: I1211 05:43:31.889877 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:43:31 crc kubenswrapper[4628]: E1211 05:43:31.890751 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:43:43 crc kubenswrapper[4628]: I1211 05:43:43.890151 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:43:43 crc kubenswrapper[4628]: E1211 05:43:43.890660 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:43:52 crc kubenswrapper[4628]: I1211 05:43:52.061926 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-clb84"] Dec 11 05:43:52 crc kubenswrapper[4628]: I1211 05:43:52.076040 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-7ppf8"] Dec 11 05:43:52 crc kubenswrapper[4628]: I1211 05:43:52.090750 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-024a-account-create-update-b9tx5"] Dec 11 05:43:52 crc kubenswrapper[4628]: I1211 05:43:52.103874 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-clb84"] Dec 11 05:43:52 crc kubenswrapper[4628]: I1211 05:43:52.113114 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-024a-account-create-update-b9tx5"] Dec 11 05:43:52 crc kubenswrapper[4628]: I1211 05:43:52.120944 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-7ppf8"] Dec 11 05:43:53 crc kubenswrapper[4628]: I1211 05:43:53.032839 4628 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-cell1-24f1-account-create-update-qfv25"] Dec 11 05:43:53 crc kubenswrapper[4628]: I1211 05:43:53.042109 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-5519-account-create-update-nfplp"] Dec 11 05:43:53 crc kubenswrapper[4628]: I1211 05:43:53.050983 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-csbvp"] Dec 11 05:43:53 crc kubenswrapper[4628]: I1211 05:43:53.062201 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-24f1-account-create-update-qfv25"] Dec 11 05:43:53 crc kubenswrapper[4628]: I1211 05:43:53.070162 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-5519-account-create-update-nfplp"] Dec 11 05:43:53 crc kubenswrapper[4628]: I1211 05:43:53.077983 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-csbvp"] Dec 11 05:43:53 crc kubenswrapper[4628]: I1211 05:43:53.906042 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b" path="/var/lib/kubelet/pods/2cf5ebbc-dbd5-415e-a0fd-f0c6b1a30d1b/volumes" Dec 11 05:43:53 crc kubenswrapper[4628]: I1211 05:43:53.908194 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e8b0526-5598-4458-8e28-c43557f08cf9" path="/var/lib/kubelet/pods/4e8b0526-5598-4458-8e28-c43557f08cf9/volumes" Dec 11 05:43:53 crc kubenswrapper[4628]: I1211 05:43:53.910673 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7642b050-2c3e-4a3d-bc5d-b5e007cf316f" path="/var/lib/kubelet/pods/7642b050-2c3e-4a3d-bc5d-b5e007cf316f/volumes" Dec 11 05:43:53 crc kubenswrapper[4628]: I1211 05:43:53.912990 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb00fa79-0866-4f48-b001-3c05352e47aa" path="/var/lib/kubelet/pods/bb00fa79-0866-4f48-b001-3c05352e47aa/volumes" Dec 11 05:43:53 crc kubenswrapper[4628]: I1211 05:43:53.917533 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3eb6441-7841-43ef-9036-08e2b3d43ed2" path="/var/lib/kubelet/pods/c3eb6441-7841-43ef-9036-08e2b3d43ed2/volumes" Dec 11 05:43:53 crc kubenswrapper[4628]: I1211 05:43:53.919505 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4dc7021-e4d0-4791-9760-4056d74989ad" path="/var/lib/kubelet/pods/d4dc7021-e4d0-4791-9760-4056d74989ad/volumes" Dec 11 05:43:56 crc kubenswrapper[4628]: I1211 05:43:56.889493 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:43:56 crc kubenswrapper[4628]: E1211 05:43:56.890458 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:44:10 crc kubenswrapper[4628]: I1211 05:44:10.889937 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:44:10 crc kubenswrapper[4628]: E1211 05:44:10.891117 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:44:21 crc kubenswrapper[4628]: I1211 05:44:21.890066 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:44:21 crc kubenswrapper[4628]: E1211 05:44:21.891038 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:44:24 crc kubenswrapper[4628]: I1211 05:44:24.053366 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k6kds"] Dec 11 05:44:24 crc kubenswrapper[4628]: I1211 05:44:24.064193 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k6kds"] Dec 11 05:44:25 crc kubenswrapper[4628]: I1211 05:44:25.899826 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58137696-e08d-4d26-ba22-d0fdb614485b" path="/var/lib/kubelet/pods/58137696-e08d-4d26-ba22-d0fdb614485b/volumes" Dec 11 05:44:27 crc kubenswrapper[4628]: I1211 05:44:27.299467 4628 scope.go:117] "RemoveContainer" containerID="329660e4821c297605c9d21981de80333adb9b65fd18540e9b3eb13602643569" Dec 11 05:44:27 crc kubenswrapper[4628]: I1211 05:44:27.335012 4628 scope.go:117] "RemoveContainer" containerID="c2761e05717dbb3b4597a9e322f788ac9de6cba83cab188acb66e2fdbf827e36" Dec 11 05:44:27 crc kubenswrapper[4628]: I1211 05:44:27.385160 4628 scope.go:117] "RemoveContainer" containerID="91d14f67d5473739a1fe6ef267a127788dcf7f4b06fbd3c8133b42eaa6bb21c4" Dec 11 05:44:27 crc kubenswrapper[4628]: I1211 05:44:27.427024 4628 scope.go:117] "RemoveContainer" containerID="f814a17eaba6c66b5c4fc7918b5ed45d24e734b6aa1ba1d977e6859c4161547d" Dec 11 05:44:27 crc kubenswrapper[4628]: I1211 05:44:27.482896 4628 scope.go:117] "RemoveContainer" containerID="bf7a6621a5d2fc28767cc644433bddbabdae8defb17a6efe742d4e9ecbb29f44" Dec 11 05:44:27 crc kubenswrapper[4628]: I1211 05:44:27.532913 4628 scope.go:117] "RemoveContainer" containerID="acf62138e38a1eec9a1a8034713f4c840aa494d78ac1bd65b44df018f272e4ca" Dec 11 05:44:27 crc kubenswrapper[4628]: I1211 05:44:27.578543 4628 scope.go:117] "RemoveContainer" containerID="046dc26b9a5c282cbf7679fa1ed8db17517608d8f12451328b5bd0ae51a7888c" Dec 11 05:44:34 crc kubenswrapper[4628]: I1211 05:44:34.890836 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:44:34 crc kubenswrapper[4628]: E1211 05:44:34.891897 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:44:38 crc kubenswrapper[4628]: I1211 05:44:38.945063 4628 generic.go:334] "Generic (PLEG): container 
finished" podID="74ebd783-bcc7-4521-a9f2-450201f04c18" containerID="e299bd8d34af2ae14bbab03ffbd6852cc4244477dfde76a369e388e7304130c5" exitCode=0 Dec 11 05:44:38 crc kubenswrapper[4628]: I1211 05:44:38.945096 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg" event={"ID":"74ebd783-bcc7-4521-a9f2-450201f04c18","Type":"ContainerDied","Data":"e299bd8d34af2ae14bbab03ffbd6852cc4244477dfde76a369e388e7304130c5"} Dec 11 05:44:40 crc kubenswrapper[4628]: I1211 05:44:40.392691 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg" Dec 11 05:44:40 crc kubenswrapper[4628]: I1211 05:44:40.559256 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74ebd783-bcc7-4521-a9f2-450201f04c18-inventory\") pod \"74ebd783-bcc7-4521-a9f2-450201f04c18\" (UID: \"74ebd783-bcc7-4521-a9f2-450201f04c18\") " Dec 11 05:44:40 crc kubenswrapper[4628]: I1211 05:44:40.559439 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74ebd783-bcc7-4521-a9f2-450201f04c18-ssh-key\") pod \"74ebd783-bcc7-4521-a9f2-450201f04c18\" (UID: \"74ebd783-bcc7-4521-a9f2-450201f04c18\") " Dec 11 05:44:40 crc kubenswrapper[4628]: I1211 05:44:40.559552 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brz5v\" (UniqueName: \"kubernetes.io/projected/74ebd783-bcc7-4521-a9f2-450201f04c18-kube-api-access-brz5v\") pod \"74ebd783-bcc7-4521-a9f2-450201f04c18\" (UID: \"74ebd783-bcc7-4521-a9f2-450201f04c18\") " Dec 11 05:44:40 crc kubenswrapper[4628]: I1211 05:44:40.566607 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74ebd783-bcc7-4521-a9f2-450201f04c18-kube-api-access-brz5v" (OuterVolumeSpecName: "kube-api-access-brz5v") pod "74ebd783-bcc7-4521-a9f2-450201f04c18" (UID: "74ebd783-bcc7-4521-a9f2-450201f04c18"). InnerVolumeSpecName "kube-api-access-brz5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:44:40 crc kubenswrapper[4628]: I1211 05:44:40.591524 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74ebd783-bcc7-4521-a9f2-450201f04c18-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "74ebd783-bcc7-4521-a9f2-450201f04c18" (UID: "74ebd783-bcc7-4521-a9f2-450201f04c18"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:44:40 crc kubenswrapper[4628]: I1211 05:44:40.613475 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74ebd783-bcc7-4521-a9f2-450201f04c18-inventory" (OuterVolumeSpecName: "inventory") pod "74ebd783-bcc7-4521-a9f2-450201f04c18" (UID: "74ebd783-bcc7-4521-a9f2-450201f04c18"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:44:40 crc kubenswrapper[4628]: I1211 05:44:40.662424 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brz5v\" (UniqueName: \"kubernetes.io/projected/74ebd783-bcc7-4521-a9f2-450201f04c18-kube-api-access-brz5v\") on node \"crc\" DevicePath \"\"" Dec 11 05:44:40 crc kubenswrapper[4628]: I1211 05:44:40.662761 4628 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74ebd783-bcc7-4521-a9f2-450201f04c18-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 05:44:40 crc kubenswrapper[4628]: I1211 05:44:40.662980 4628 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74ebd783-bcc7-4521-a9f2-450201f04c18-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:44:40 crc kubenswrapper[4628]: I1211 05:44:40.974674 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg" event={"ID":"74ebd783-bcc7-4521-a9f2-450201f04c18","Type":"ContainerDied","Data":"0c50be68470053cfa76ce45fc0d3c9cfb7af55dbca61ccadcafac099e756faa5"} Dec 11 05:44:40 crc kubenswrapper[4628]: I1211 05:44:40.974735 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c50be68470053cfa76ce45fc0d3c9cfb7af55dbca61ccadcafac099e756faa5" Dec 11 05:44:40 crc kubenswrapper[4628]: I1211 05:44:40.974954 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.113904 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl"] Dec 11 05:44:41 crc kubenswrapper[4628]: E1211 05:44:41.114409 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74ebd783-bcc7-4521-a9f2-450201f04c18" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.114435 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="74ebd783-bcc7-4521-a9f2-450201f04c18" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.114679 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="74ebd783-bcc7-4521-a9f2-450201f04c18" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.115570 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.119270 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.119670 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t5hzf" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.119789 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.130465 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.142368 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl"] Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.276628 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kptqf\" (UniqueName: \"kubernetes.io/projected/2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2-kube-api-access-kptqf\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl\" (UID: \"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.276687 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl\" (UID: \"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.276737 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl\" (UID: \"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.378728 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl\" (UID: \"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.378964 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kptqf\" (UniqueName: \"kubernetes.io/projected/2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2-kube-api-access-kptqf\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl\" (UID: \"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.379023 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl\" (UID: \"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.385489 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl\" (UID: \"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.385535 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl\" (UID: \"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.403231 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kptqf\" (UniqueName: \"kubernetes.io/projected/2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2-kube-api-access-kptqf\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl\" (UID: \"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.448587 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl" Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.794017 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl"] Dec 11 05:44:41 crc kubenswrapper[4628]: I1211 05:44:41.987050 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl" event={"ID":"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2","Type":"ContainerStarted","Data":"d074f0bcb4608f364bf760d2d4c8049ac6ab0ea6ed8db641ce19d294b16c0359"} Dec 11 05:44:42 crc kubenswrapper[4628]: I1211 05:44:42.995442 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl" event={"ID":"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2","Type":"ContainerStarted","Data":"3cf2f7c4057c8c58cef42ff22196a3227a38335449f3bf9862b8926b0d328ff4"} Dec 11 05:44:43 crc kubenswrapper[4628]: I1211 05:44:43.018672 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl" podStartSLOduration=1.5037332829999999 podStartE2EDuration="2.018656983s" podCreationTimestamp="2025-12-11 05:44:41 +0000 UTC" firstStartedPulling="2025-12-11 05:44:41.800703338 +0000 UTC m=+1784.218050046" lastFinishedPulling="2025-12-11 05:44:42.315627028 +0000 UTC m=+1784.732973746" observedRunningTime="2025-12-11 05:44:43.010807561 +0000 UTC m=+1785.428154269" watchObservedRunningTime="2025-12-11 05:44:43.018656983 +0000 UTC m=+1785.436003681" Dec 11 05:44:46 crc kubenswrapper[4628]: I1211 05:44:46.890108 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:44:46 crc kubenswrapper[4628]: E1211 05:44:46.890712 4628 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:44:48 crc kubenswrapper[4628]: I1211 05:44:48.041985 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-7bm5v"] Dec 11 05:44:48 crc kubenswrapper[4628]: I1211 05:44:48.045353 4628 generic.go:334] "Generic (PLEG): container finished" podID="2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2" containerID="3cf2f7c4057c8c58cef42ff22196a3227a38335449f3bf9862b8926b0d328ff4" exitCode=0 Dec 11 05:44:48 crc kubenswrapper[4628]: I1211 05:44:48.045552 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl" event={"ID":"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2","Type":"ContainerDied","Data":"3cf2f7c4057c8c58cef42ff22196a3227a38335449f3bf9862b8926b0d328ff4"} Dec 11 05:44:48 crc kubenswrapper[4628]: I1211 05:44:48.069796 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nsgqf"] Dec 11 05:44:48 crc kubenswrapper[4628]: I1211 05:44:48.089337 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nsgqf"] Dec 11 05:44:48 crc kubenswrapper[4628]: I1211 05:44:48.104462 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-7bm5v"] Dec 11 05:44:49 crc kubenswrapper[4628]: I1211 05:44:49.455935 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl" Dec 11 05:44:49 crc kubenswrapper[4628]: I1211 05:44:49.638572 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kptqf\" (UniqueName: \"kubernetes.io/projected/2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2-kube-api-access-kptqf\") pod \"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2\" (UID: \"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2\") " Dec 11 05:44:49 crc kubenswrapper[4628]: I1211 05:44:49.639008 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2-inventory\") pod \"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2\" (UID: \"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2\") " Dec 11 05:44:49 crc kubenswrapper[4628]: I1211 05:44:49.639150 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2-ssh-key\") pod \"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2\" (UID: \"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2\") " Dec 11 05:44:49 crc kubenswrapper[4628]: I1211 05:44:49.673070 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2-kube-api-access-kptqf" (OuterVolumeSpecName: "kube-api-access-kptqf") pod "2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2" (UID: "2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2"). InnerVolumeSpecName "kube-api-access-kptqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:44:49 crc kubenswrapper[4628]: I1211 05:44:49.674080 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2" (UID: "2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:44:49 crc kubenswrapper[4628]: I1211 05:44:49.708523 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2-inventory" (OuterVolumeSpecName: "inventory") pod "2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2" (UID: "2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:44:49 crc kubenswrapper[4628]: I1211 05:44:49.741548 4628 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:44:49 crc kubenswrapper[4628]: I1211 05:44:49.741584 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kptqf\" (UniqueName: \"kubernetes.io/projected/2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2-kube-api-access-kptqf\") on node \"crc\" DevicePath \"\"" Dec 11 05:44:49 crc kubenswrapper[4628]: I1211 05:44:49.741596 4628 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 05:44:49 crc kubenswrapper[4628]: I1211 05:44:49.899697 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4df8deca-75c5-40b2-a666-4b1c6050c273" path="/var/lib/kubelet/pods/4df8deca-75c5-40b2-a666-4b1c6050c273/volumes" Dec 11 05:44:49 crc kubenswrapper[4628]: I1211 05:44:49.900453 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d090d098-9d30-4ee0-89e6-a408f1340325" path="/var/lib/kubelet/pods/d090d098-9d30-4ee0-89e6-a408f1340325/volumes" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.060498 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl" event={"ID":"2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2","Type":"ContainerDied","Data":"d074f0bcb4608f364bf760d2d4c8049ac6ab0ea6ed8db641ce19d294b16c0359"} Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.060718 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.060828 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d074f0bcb4608f364bf760d2d4c8049ac6ab0ea6ed8db641ce19d294b16c0359" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.183784 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5"] Dec 11 05:44:50 crc kubenswrapper[4628]: E1211 05:44:50.184579 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.184699 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.184956 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.185631 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.187953 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t5hzf" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.188110 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.188387 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.188500 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.206853 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5"] Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.251338 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7802f047-ef49-4339-8783-fa927f841103-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jhbk5\" (UID: \"7802f047-ef49-4339-8783-fa927f841103\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.251407 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdn98\" (UniqueName: \"kubernetes.io/projected/7802f047-ef49-4339-8783-fa927f841103-kube-api-access-wdn98\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jhbk5\" (UID: \"7802f047-ef49-4339-8783-fa927f841103\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.251441 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7802f047-ef49-4339-8783-fa927f841103-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-jhbk5\" (UID: \"7802f047-ef49-4339-8783-fa927f841103\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.353419 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7802f047-ef49-4339-8783-fa927f841103-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jhbk5\" (UID: \"7802f047-ef49-4339-8783-fa927f841103\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.353501 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdn98\" (UniqueName: \"kubernetes.io/projected/7802f047-ef49-4339-8783-fa927f841103-kube-api-access-wdn98\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jhbk5\" (UID: \"7802f047-ef49-4339-8783-fa927f841103\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.353550 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7802f047-ef49-4339-8783-fa927f841103-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jhbk5\" (UID: \"7802f047-ef49-4339-8783-fa927f841103\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.357970 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7802f047-ef49-4339-8783-fa927f841103-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jhbk5\" (UID: \"7802f047-ef49-4339-8783-fa927f841103\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.358739 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7802f047-ef49-4339-8783-fa927f841103-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jhbk5\" (UID: \"7802f047-ef49-4339-8783-fa927f841103\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.382796 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdn98\" (UniqueName: \"kubernetes.io/projected/7802f047-ef49-4339-8783-fa927f841103-kube-api-access-wdn98\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jhbk5\" (UID: \"7802f047-ef49-4339-8783-fa927f841103\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5" Dec 11 05:44:50 crc kubenswrapper[4628]: I1211 05:44:50.507300 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5" Dec 11 05:44:51 crc kubenswrapper[4628]: I1211 05:44:51.133506 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5"] Dec 11 05:44:51 crc kubenswrapper[4628]: W1211 05:44:51.138730 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7802f047_ef49_4339_8783_fa927f841103.slice/crio-f6f2bfb992524d4a9100730568f680f97dc36eec775d6e5d04d6b84982b9e0d1 WatchSource:0}: Error finding container f6f2bfb992524d4a9100730568f680f97dc36eec775d6e5d04d6b84982b9e0d1: Status 404 returned error can't find the container with id f6f2bfb992524d4a9100730568f680f97dc36eec775d6e5d04d6b84982b9e0d1 Dec 11 05:44:52 crc kubenswrapper[4628]: I1211 05:44:52.124778 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5" event={"ID":"7802f047-ef49-4339-8783-fa927f841103","Type":"ContainerStarted","Data":"f7b96b3ba7f8197957ed04f03ff74aed2833a6afa132077969db32971c006aef"} Dec 11 05:44:52 crc kubenswrapper[4628]: I1211 05:44:52.125597 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5" event={"ID":"7802f047-ef49-4339-8783-fa927f841103","Type":"ContainerStarted","Data":"f6f2bfb992524d4a9100730568f680f97dc36eec775d6e5d04d6b84982b9e0d1"} Dec 11 05:44:52 crc kubenswrapper[4628]: I1211 05:44:52.144930 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5" podStartSLOduration=1.726073459 podStartE2EDuration="2.144913112s" podCreationTimestamp="2025-12-11 05:44:50 +0000 UTC" firstStartedPulling="2025-12-11 05:44:51.141977159 +0000 UTC m=+1793.559323847" lastFinishedPulling="2025-12-11 05:44:51.560816802 +0000 UTC m=+1793.978163500" observedRunningTime="2025-12-11 05:44:52.143057382 +0000 UTC m=+1794.560404120" watchObservedRunningTime="2025-12-11 05:44:52.144913112 +0000 UTC m=+1794.562259810" Dec 11 05:44:58 crc kubenswrapper[4628]: I1211 05:44:58.892617 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:44:58 crc kubenswrapper[4628]: E1211 05:44:58.894251 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:45:00 crc kubenswrapper[4628]: I1211 05:45:00.137174 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv"] Dec 11 05:45:00 crc kubenswrapper[4628]: I1211 05:45:00.138938 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv" Dec 11 05:45:00 crc kubenswrapper[4628]: I1211 05:45:00.141528 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 11 05:45:00 crc kubenswrapper[4628]: I1211 05:45:00.148657 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 11 05:45:00 crc kubenswrapper[4628]: I1211 05:45:00.157646 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv"] Dec 11 05:45:00 crc kubenswrapper[4628]: I1211 05:45:00.255608 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0-secret-volume\") pod \"collect-profiles-29423865-6h8xv\" (UID: \"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv" Dec 11 05:45:00 crc kubenswrapper[4628]: I1211 05:45:00.256120 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0-config-volume\") pod \"collect-profiles-29423865-6h8xv\" (UID: \"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv" Dec 11 05:45:00 crc kubenswrapper[4628]: I1211 05:45:00.256268 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qmcz\" (UniqueName: \"kubernetes.io/projected/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0-kube-api-access-7qmcz\") pod \"collect-profiles-29423865-6h8xv\" (UID: \"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv" Dec 11 05:45:00 crc kubenswrapper[4628]: I1211 05:45:00.357469 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0-config-volume\") pod \"collect-profiles-29423865-6h8xv\" (UID: \"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv" Dec 11 05:45:00 crc kubenswrapper[4628]: I1211 05:45:00.357581 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qmcz\" (UniqueName: \"kubernetes.io/projected/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0-kube-api-access-7qmcz\") pod \"collect-profiles-29423865-6h8xv\" (UID: \"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv" Dec 11 05:45:00 crc kubenswrapper[4628]: I1211 05:45:00.357678 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0-secret-volume\") pod \"collect-profiles-29423865-6h8xv\" (UID: \"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv" Dec 11 05:45:00 crc kubenswrapper[4628]: I1211 05:45:00.358648 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0-config-volume\") pod 
\"collect-profiles-29423865-6h8xv\" (UID: \"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv" Dec 11 05:45:00 crc kubenswrapper[4628]: I1211 05:45:00.363282 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0-secret-volume\") pod \"collect-profiles-29423865-6h8xv\" (UID: \"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv" Dec 11 05:45:00 crc kubenswrapper[4628]: I1211 05:45:00.376693 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qmcz\" (UniqueName: \"kubernetes.io/projected/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0-kube-api-access-7qmcz\") pod \"collect-profiles-29423865-6h8xv\" (UID: \"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv" Dec 11 05:45:00 crc kubenswrapper[4628]: I1211 05:45:00.470561 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv" Dec 11 05:45:00 crc kubenswrapper[4628]: I1211 05:45:00.945967 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv"] Dec 11 05:45:01 crc kubenswrapper[4628]: I1211 05:45:01.304610 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv" event={"ID":"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0","Type":"ContainerStarted","Data":"0663e4609851b140d73b7f7047af6f64934ff3a538afe9374f31e273207cb664"} Dec 11 05:45:01 crc kubenswrapper[4628]: I1211 05:45:01.304977 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv" event={"ID":"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0","Type":"ContainerStarted","Data":"46c232a266a2a0675df825062f108988bb762b9564b20aaaf37f0e8ab02288b7"} Dec 11 05:45:01 crc kubenswrapper[4628]: I1211 05:45:01.352206 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv" podStartSLOduration=1.35218475 podStartE2EDuration="1.35218475s" podCreationTimestamp="2025-12-11 05:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 05:45:01.348588843 +0000 UTC m=+1803.765935561" watchObservedRunningTime="2025-12-11 05:45:01.35218475 +0000 UTC m=+1803.769531448" Dec 11 05:45:02 crc kubenswrapper[4628]: I1211 05:45:02.315761 4628 generic.go:334] "Generic (PLEG): container finished" podID="2a70a7e6-67fc-41c4-ab90-96aa3f3411f0" containerID="0663e4609851b140d73b7f7047af6f64934ff3a538afe9374f31e273207cb664" exitCode=0 Dec 11 05:45:02 crc kubenswrapper[4628]: I1211 05:45:02.316115 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv" event={"ID":"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0","Type":"ContainerDied","Data":"0663e4609851b140d73b7f7047af6f64934ff3a538afe9374f31e273207cb664"} Dec 11 05:45:03 crc kubenswrapper[4628]: I1211 05:45:03.664775 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv" Dec 11 05:45:03 crc kubenswrapper[4628]: I1211 05:45:03.828408 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0-secret-volume\") pod \"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0\" (UID: \"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0\") " Dec 11 05:45:03 crc kubenswrapper[4628]: I1211 05:45:03.828868 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qmcz\" (UniqueName: \"kubernetes.io/projected/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0-kube-api-access-7qmcz\") pod \"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0\" (UID: \"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0\") " Dec 11 05:45:03 crc kubenswrapper[4628]: I1211 05:45:03.828897 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0-config-volume\") pod \"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0\" (UID: \"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0\") " Dec 11 05:45:03 crc kubenswrapper[4628]: I1211 05:45:03.829696 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0-config-volume" (OuterVolumeSpecName: "config-volume") pod "2a70a7e6-67fc-41c4-ab90-96aa3f3411f0" (UID: "2a70a7e6-67fc-41c4-ab90-96aa3f3411f0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:45:03 crc kubenswrapper[4628]: I1211 05:45:03.834483 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2a70a7e6-67fc-41c4-ab90-96aa3f3411f0" (UID: "2a70a7e6-67fc-41c4-ab90-96aa3f3411f0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:45:03 crc kubenswrapper[4628]: I1211 05:45:03.835107 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0-kube-api-access-7qmcz" (OuterVolumeSpecName: "kube-api-access-7qmcz") pod "2a70a7e6-67fc-41c4-ab90-96aa3f3411f0" (UID: "2a70a7e6-67fc-41c4-ab90-96aa3f3411f0"). InnerVolumeSpecName "kube-api-access-7qmcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:45:03 crc kubenswrapper[4628]: I1211 05:45:03.930997 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qmcz\" (UniqueName: \"kubernetes.io/projected/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0-kube-api-access-7qmcz\") on node \"crc\" DevicePath \"\"" Dec 11 05:45:03 crc kubenswrapper[4628]: I1211 05:45:03.931038 4628 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0-config-volume\") on node \"crc\" DevicePath \"\"" Dec 11 05:45:03 crc kubenswrapper[4628]: I1211 05:45:03.931052 4628 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 11 05:45:04 crc kubenswrapper[4628]: I1211 05:45:04.334015 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv" event={"ID":"2a70a7e6-67fc-41c4-ab90-96aa3f3411f0","Type":"ContainerDied","Data":"46c232a266a2a0675df825062f108988bb762b9564b20aaaf37f0e8ab02288b7"} Dec 11 05:45:04 crc kubenswrapper[4628]: I1211 05:45:04.334062 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46c232a266a2a0675df825062f108988bb762b9564b20aaaf37f0e8ab02288b7" Dec 11 05:45:04 crc kubenswrapper[4628]: I1211 05:45:04.334087 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv" Dec 11 05:45:09 crc kubenswrapper[4628]: I1211 05:45:09.890292 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:45:09 crc kubenswrapper[4628]: E1211 05:45:09.891517 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:45:22 crc kubenswrapper[4628]: I1211 05:45:22.890901 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:45:22 crc kubenswrapper[4628]: E1211 05:45:22.892636 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:45:27 crc kubenswrapper[4628]: I1211 05:45:27.715923 4628 scope.go:117] "RemoveContainer" containerID="32140fba5d8f55d75c4386c2a272a4df60e1a31b54f45ba902474c0205177e5e" Dec 11 05:45:27 crc kubenswrapper[4628]: I1211 05:45:27.773541 4628 scope.go:117] "RemoveContainer" containerID="e0c7e6e0b9f64661b8ee1a96199acb85c920ee37827d52abac1e94d14daf375a" Dec 11 05:45:34 crc kubenswrapper[4628]: I1211 05:45:34.069526 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-z4bpl"] Dec 11 05:45:34 crc kubenswrapper[4628]: 
I1211 05:45:34.080233 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-z4bpl"] Dec 11 05:45:35 crc kubenswrapper[4628]: I1211 05:45:35.900596 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af" path="/var/lib/kubelet/pods/2c58618e-b18c-4ef8-8bc9-d3b08fc5b5af/volumes" Dec 11 05:45:36 crc kubenswrapper[4628]: I1211 05:45:36.889596 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:45:37 crc kubenswrapper[4628]: I1211 05:45:37.608789 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"9cede5e779a453211cb2fa7e39e6adea8ce5ae18e8668e6f7a6e7c92985fdcda"} Dec 11 05:45:40 crc kubenswrapper[4628]: I1211 05:45:40.640656 4628 generic.go:334] "Generic (PLEG): container finished" podID="7802f047-ef49-4339-8783-fa927f841103" containerID="f7b96b3ba7f8197957ed04f03ff74aed2833a6afa132077969db32971c006aef" exitCode=0 Dec 11 05:45:40 crc kubenswrapper[4628]: I1211 05:45:40.640721 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5" event={"ID":"7802f047-ef49-4339-8783-fa927f841103","Type":"ContainerDied","Data":"f7b96b3ba7f8197957ed04f03ff74aed2833a6afa132077969db32971c006aef"} Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.068462 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5" Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.192507 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7802f047-ef49-4339-8783-fa927f841103-inventory\") pod \"7802f047-ef49-4339-8783-fa927f841103\" (UID: \"7802f047-ef49-4339-8783-fa927f841103\") " Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.192700 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdn98\" (UniqueName: \"kubernetes.io/projected/7802f047-ef49-4339-8783-fa927f841103-kube-api-access-wdn98\") pod \"7802f047-ef49-4339-8783-fa927f841103\" (UID: \"7802f047-ef49-4339-8783-fa927f841103\") " Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.192838 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7802f047-ef49-4339-8783-fa927f841103-ssh-key\") pod \"7802f047-ef49-4339-8783-fa927f841103\" (UID: \"7802f047-ef49-4339-8783-fa927f841103\") " Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.206983 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7802f047-ef49-4339-8783-fa927f841103-kube-api-access-wdn98" (OuterVolumeSpecName: "kube-api-access-wdn98") pod "7802f047-ef49-4339-8783-fa927f841103" (UID: "7802f047-ef49-4339-8783-fa927f841103"). InnerVolumeSpecName "kube-api-access-wdn98". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.227087 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7802f047-ef49-4339-8783-fa927f841103-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7802f047-ef49-4339-8783-fa927f841103" (UID: "7802f047-ef49-4339-8783-fa927f841103"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.242222 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7802f047-ef49-4339-8783-fa927f841103-inventory" (OuterVolumeSpecName: "inventory") pod "7802f047-ef49-4339-8783-fa927f841103" (UID: "7802f047-ef49-4339-8783-fa927f841103"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.294633 4628 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7802f047-ef49-4339-8783-fa927f841103-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.295405 4628 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7802f047-ef49-4339-8783-fa927f841103-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.295500 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdn98\" (UniqueName: \"kubernetes.io/projected/7802f047-ef49-4339-8783-fa927f841103-kube-api-access-wdn98\") on node \"crc\" DevicePath \"\"" Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.660102 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5" event={"ID":"7802f047-ef49-4339-8783-fa927f841103","Type":"ContainerDied","Data":"f6f2bfb992524d4a9100730568f680f97dc36eec775d6e5d04d6b84982b9e0d1"} Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.660157 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6f2bfb992524d4a9100730568f680f97dc36eec775d6e5d04d6b84982b9e0d1" Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.660184 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jhbk5" Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.891600 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d"] Dec 11 05:45:42 crc kubenswrapper[4628]: E1211 05:45:42.891958 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a70a7e6-67fc-41c4-ab90-96aa3f3411f0" containerName="collect-profiles" Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.891974 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a70a7e6-67fc-41c4-ab90-96aa3f3411f0" containerName="collect-profiles" Dec 11 05:45:42 crc kubenswrapper[4628]: E1211 05:45:42.892003 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7802f047-ef49-4339-8783-fa927f841103" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.892012 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="7802f047-ef49-4339-8783-fa927f841103" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.892193 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="7802f047-ef49-4339-8783-fa927f841103" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.892222 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a70a7e6-67fc-41c4-ab90-96aa3f3411f0" containerName="collect-profiles" Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.892800 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d" Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.902815 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.903035 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.903091 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.922810 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d"] Dec 11 05:45:42 crc kubenswrapper[4628]: I1211 05:45:42.976200 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t5hzf" Dec 11 05:45:43 crc kubenswrapper[4628]: I1211 05:45:43.015133 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9ntp\" (UniqueName: \"kubernetes.io/projected/6838fcd4-0c2b-4c92-880c-eb9029af8a00-kube-api-access-m9ntp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d\" (UID: \"6838fcd4-0c2b-4c92-880c-eb9029af8a00\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d" Dec 11 05:45:43 crc kubenswrapper[4628]: I1211 05:45:43.015518 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6838fcd4-0c2b-4c92-880c-eb9029af8a00-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d\" (UID: \"6838fcd4-0c2b-4c92-880c-eb9029af8a00\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d" Dec 11 05:45:43 crc kubenswrapper[4628]: I1211 05:45:43.015623 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6838fcd4-0c2b-4c92-880c-eb9029af8a00-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d\" (UID: \"6838fcd4-0c2b-4c92-880c-eb9029af8a00\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d" Dec 11 05:45:43 crc kubenswrapper[4628]: I1211 05:45:43.117562 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6838fcd4-0c2b-4c92-880c-eb9029af8a00-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d\" (UID: \"6838fcd4-0c2b-4c92-880c-eb9029af8a00\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d" Dec 11 05:45:43 crc kubenswrapper[4628]: I1211 05:45:43.117725 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6838fcd4-0c2b-4c92-880c-eb9029af8a00-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d\" (UID: \"6838fcd4-0c2b-4c92-880c-eb9029af8a00\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d" Dec 11 05:45:43 crc kubenswrapper[4628]: I1211 05:45:43.117778 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9ntp\" (UniqueName: \"kubernetes.io/projected/6838fcd4-0c2b-4c92-880c-eb9029af8a00-kube-api-access-m9ntp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d\" (UID: \"6838fcd4-0c2b-4c92-880c-eb9029af8a00\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d" Dec 11 05:45:43 crc kubenswrapper[4628]: I1211 05:45:43.125720 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6838fcd4-0c2b-4c92-880c-eb9029af8a00-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d\" (UID: \"6838fcd4-0c2b-4c92-880c-eb9029af8a00\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d" Dec 11 05:45:43 crc kubenswrapper[4628]: I1211 05:45:43.126292 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6838fcd4-0c2b-4c92-880c-eb9029af8a00-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d\" (UID: \"6838fcd4-0c2b-4c92-880c-eb9029af8a00\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d" Dec 11 05:45:43 crc kubenswrapper[4628]: I1211 05:45:43.141356 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9ntp\" (UniqueName: \"kubernetes.io/projected/6838fcd4-0c2b-4c92-880c-eb9029af8a00-kube-api-access-m9ntp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d\" (UID: \"6838fcd4-0c2b-4c92-880c-eb9029af8a00\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d" Dec 11 05:45:43 crc kubenswrapper[4628]: I1211 05:45:43.313282 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d" Dec 11 05:45:43 crc kubenswrapper[4628]: I1211 05:45:43.804054 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d"] Dec 11 05:45:43 crc kubenswrapper[4628]: I1211 05:45:43.811077 4628 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 05:45:44 crc kubenswrapper[4628]: I1211 05:45:44.675394 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d" event={"ID":"6838fcd4-0c2b-4c92-880c-eb9029af8a00","Type":"ContainerStarted","Data":"327ab49980411d473d8a64ab3e4bcbeb18c412bc7ac2fce055548abca7ff6832"} Dec 11 05:45:44 crc kubenswrapper[4628]: I1211 05:45:44.675659 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d" event={"ID":"6838fcd4-0c2b-4c92-880c-eb9029af8a00","Type":"ContainerStarted","Data":"386200bbbfddb1c6440ea457cc62eb3b110077927e7114e5da4aacee3b18394a"} Dec 11 05:45:44 crc kubenswrapper[4628]: I1211 05:45:44.691489 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d" podStartSLOduration=2.11318688 podStartE2EDuration="2.691468685s" podCreationTimestamp="2025-12-11 05:45:42 +0000 UTC" firstStartedPulling="2025-12-11 05:45:43.810873853 +0000 UTC m=+1846.228220551" lastFinishedPulling="2025-12-11 05:45:44.389155618 +0000 UTC m=+1846.806502356" observedRunningTime="2025-12-11 05:45:44.688911497 +0000 UTC m=+1847.106258225" watchObservedRunningTime="2025-12-11 05:45:44.691468685 +0000 UTC m=+1847.108815393" Dec 11 05:46:27 crc kubenswrapper[4628]: I1211 05:46:27.883414 4628 scope.go:117] "RemoveContainer" containerID="8f835c5e648cc10d9e830c9db1308e78091fbd6c868b72b8ca834ea31b0a95c4" Dec 11 05:46:43 crc kubenswrapper[4628]: I1211 05:46:43.257647 4628 generic.go:334] "Generic (PLEG): container finished" podID="6838fcd4-0c2b-4c92-880c-eb9029af8a00" containerID="327ab49980411d473d8a64ab3e4bcbeb18c412bc7ac2fce055548abca7ff6832" exitCode=0 Dec 11 05:46:43 crc kubenswrapper[4628]: I1211 05:46:43.257725 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d" event={"ID":"6838fcd4-0c2b-4c92-880c-eb9029af8a00","Type":"ContainerDied","Data":"327ab49980411d473d8a64ab3e4bcbeb18c412bc7ac2fce055548abca7ff6832"} Dec 11 05:46:44 crc kubenswrapper[4628]: I1211 05:46:44.688996 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d" Dec 11 05:46:44 crc kubenswrapper[4628]: I1211 05:46:44.862158 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6838fcd4-0c2b-4c92-880c-eb9029af8a00-ssh-key\") pod \"6838fcd4-0c2b-4c92-880c-eb9029af8a00\" (UID: \"6838fcd4-0c2b-4c92-880c-eb9029af8a00\") " Dec 11 05:46:44 crc kubenswrapper[4628]: I1211 05:46:44.862281 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9ntp\" (UniqueName: \"kubernetes.io/projected/6838fcd4-0c2b-4c92-880c-eb9029af8a00-kube-api-access-m9ntp\") pod \"6838fcd4-0c2b-4c92-880c-eb9029af8a00\" (UID: \"6838fcd4-0c2b-4c92-880c-eb9029af8a00\") " Dec 11 05:46:44 crc kubenswrapper[4628]: I1211 05:46:44.862456 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6838fcd4-0c2b-4c92-880c-eb9029af8a00-inventory\") pod \"6838fcd4-0c2b-4c92-880c-eb9029af8a00\" (UID: \"6838fcd4-0c2b-4c92-880c-eb9029af8a00\") " Dec 11 05:46:44 crc kubenswrapper[4628]: I1211 05:46:44.879026 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6838fcd4-0c2b-4c92-880c-eb9029af8a00-kube-api-access-m9ntp" (OuterVolumeSpecName: "kube-api-access-m9ntp") pod "6838fcd4-0c2b-4c92-880c-eb9029af8a00" (UID: "6838fcd4-0c2b-4c92-880c-eb9029af8a00"). InnerVolumeSpecName "kube-api-access-m9ntp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:46:44 crc kubenswrapper[4628]: I1211 05:46:44.893984 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6838fcd4-0c2b-4c92-880c-eb9029af8a00-inventory" (OuterVolumeSpecName: "inventory") pod "6838fcd4-0c2b-4c92-880c-eb9029af8a00" (UID: "6838fcd4-0c2b-4c92-880c-eb9029af8a00"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:46:44 crc kubenswrapper[4628]: I1211 05:46:44.916601 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6838fcd4-0c2b-4c92-880c-eb9029af8a00-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6838fcd4-0c2b-4c92-880c-eb9029af8a00" (UID: "6838fcd4-0c2b-4c92-880c-eb9029af8a00"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:46:44 crc kubenswrapper[4628]: I1211 05:46:44.965158 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9ntp\" (UniqueName: \"kubernetes.io/projected/6838fcd4-0c2b-4c92-880c-eb9029af8a00-kube-api-access-m9ntp\") on node \"crc\" DevicePath \"\"" Dec 11 05:46:44 crc kubenswrapper[4628]: I1211 05:46:44.965216 4628 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6838fcd4-0c2b-4c92-880c-eb9029af8a00-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 05:46:44 crc kubenswrapper[4628]: I1211 05:46:44.965233 4628 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6838fcd4-0c2b-4c92-880c-eb9029af8a00-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.282564 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d" event={"ID":"6838fcd4-0c2b-4c92-880c-eb9029af8a00","Type":"ContainerDied","Data":"386200bbbfddb1c6440ea457cc62eb3b110077927e7114e5da4aacee3b18394a"} Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.282869 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="386200bbbfddb1c6440ea457cc62eb3b110077927e7114e5da4aacee3b18394a" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.282642 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.383358 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mfrfd"] Dec 11 05:46:45 crc kubenswrapper[4628]: E1211 05:46:45.384924 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6838fcd4-0c2b-4c92-880c-eb9029af8a00" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.384984 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="6838fcd4-0c2b-4c92-880c-eb9029af8a00" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.385802 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="6838fcd4-0c2b-4c92-880c-eb9029af8a00" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.387057 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mfrfd" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.403456 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.403751 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t5hzf" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.404711 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.405036 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.414803 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mfrfd"] Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.580246 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/179d12ab-f93f-4ce5-a674-deed794d48f0-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-mfrfd\" (UID: \"179d12ab-f93f-4ce5-a674-deed794d48f0\") " pod="openstack/ssh-known-hosts-edpm-deployment-mfrfd" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.580673 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/179d12ab-f93f-4ce5-a674-deed794d48f0-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-mfrfd\" (UID: \"179d12ab-f93f-4ce5-a674-deed794d48f0\") " pod="openstack/ssh-known-hosts-edpm-deployment-mfrfd" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.580721 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77t8n\" (UniqueName: \"kubernetes.io/projected/179d12ab-f93f-4ce5-a674-deed794d48f0-kube-api-access-77t8n\") pod \"ssh-known-hosts-edpm-deployment-mfrfd\" (UID: \"179d12ab-f93f-4ce5-a674-deed794d48f0\") " pod="openstack/ssh-known-hosts-edpm-deployment-mfrfd" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.682904 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/179d12ab-f93f-4ce5-a674-deed794d48f0-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-mfrfd\" (UID: \"179d12ab-f93f-4ce5-a674-deed794d48f0\") " pod="openstack/ssh-known-hosts-edpm-deployment-mfrfd" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.682970 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77t8n\" (UniqueName: \"kubernetes.io/projected/179d12ab-f93f-4ce5-a674-deed794d48f0-kube-api-access-77t8n\") pod \"ssh-known-hosts-edpm-deployment-mfrfd\" (UID: \"179d12ab-f93f-4ce5-a674-deed794d48f0\") " pod="openstack/ssh-known-hosts-edpm-deployment-mfrfd" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.683123 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/179d12ab-f93f-4ce5-a674-deed794d48f0-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-mfrfd\" (UID: \"179d12ab-f93f-4ce5-a674-deed794d48f0\") " pod="openstack/ssh-known-hosts-edpm-deployment-mfrfd" Dec 11 05:46:45 crc 
kubenswrapper[4628]: I1211 05:46:45.688238 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/179d12ab-f93f-4ce5-a674-deed794d48f0-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-mfrfd\" (UID: \"179d12ab-f93f-4ce5-a674-deed794d48f0\") " pod="openstack/ssh-known-hosts-edpm-deployment-mfrfd" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.689514 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/179d12ab-f93f-4ce5-a674-deed794d48f0-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-mfrfd\" (UID: \"179d12ab-f93f-4ce5-a674-deed794d48f0\") " pod="openstack/ssh-known-hosts-edpm-deployment-mfrfd" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.704466 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77t8n\" (UniqueName: \"kubernetes.io/projected/179d12ab-f93f-4ce5-a674-deed794d48f0-kube-api-access-77t8n\") pod \"ssh-known-hosts-edpm-deployment-mfrfd\" (UID: \"179d12ab-f93f-4ce5-a674-deed794d48f0\") " pod="openstack/ssh-known-hosts-edpm-deployment-mfrfd" Dec 11 05:46:45 crc kubenswrapper[4628]: I1211 05:46:45.732444 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mfrfd" Dec 11 05:46:46 crc kubenswrapper[4628]: I1211 05:46:46.270518 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mfrfd"] Dec 11 05:46:46 crc kubenswrapper[4628]: I1211 05:46:46.294181 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mfrfd" event={"ID":"179d12ab-f93f-4ce5-a674-deed794d48f0","Type":"ContainerStarted","Data":"eb588aa6371681fe4bc52f8ef3fdcb9113b2342f853a9a016ffa470d06e0f1d6"} Dec 11 05:46:47 crc kubenswrapper[4628]: I1211 05:46:47.306451 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mfrfd" event={"ID":"179d12ab-f93f-4ce5-a674-deed794d48f0","Type":"ContainerStarted","Data":"20b4cfa4726946b8c797b2d8fb22813d5159d2b09512130626cc6c3402657ce3"} Dec 11 05:46:47 crc kubenswrapper[4628]: I1211 05:46:47.330290 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-mfrfd" podStartSLOduration=1.837449707 podStartE2EDuration="2.330266706s" podCreationTimestamp="2025-12-11 05:46:45 +0000 UTC" firstStartedPulling="2025-12-11 05:46:46.280966058 +0000 UTC m=+1908.698312766" lastFinishedPulling="2025-12-11 05:46:46.773783057 +0000 UTC m=+1909.191129765" observedRunningTime="2025-12-11 05:46:47.321343117 +0000 UTC m=+1909.738689815" watchObservedRunningTime="2025-12-11 05:46:47.330266706 +0000 UTC m=+1909.747613404" Dec 11 05:46:55 crc kubenswrapper[4628]: I1211 05:46:55.383300 4628 generic.go:334] "Generic (PLEG): container finished" podID="179d12ab-f93f-4ce5-a674-deed794d48f0" containerID="20b4cfa4726946b8c797b2d8fb22813d5159d2b09512130626cc6c3402657ce3" exitCode=0 Dec 11 05:46:55 crc kubenswrapper[4628]: I1211 05:46:55.383352 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mfrfd" event={"ID":"179d12ab-f93f-4ce5-a674-deed794d48f0","Type":"ContainerDied","Data":"20b4cfa4726946b8c797b2d8fb22813d5159d2b09512130626cc6c3402657ce3"} Dec 11 05:46:56 crc kubenswrapper[4628]: I1211 05:46:56.826142 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mfrfd" Dec 11 05:46:56 crc kubenswrapper[4628]: I1211 05:46:56.986394 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/179d12ab-f93f-4ce5-a674-deed794d48f0-ssh-key-openstack-edpm-ipam\") pod \"179d12ab-f93f-4ce5-a674-deed794d48f0\" (UID: \"179d12ab-f93f-4ce5-a674-deed794d48f0\") " Dec 11 05:46:56 crc kubenswrapper[4628]: I1211 05:46:56.986524 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77t8n\" (UniqueName: \"kubernetes.io/projected/179d12ab-f93f-4ce5-a674-deed794d48f0-kube-api-access-77t8n\") pod \"179d12ab-f93f-4ce5-a674-deed794d48f0\" (UID: \"179d12ab-f93f-4ce5-a674-deed794d48f0\") " Dec 11 05:46:56 crc kubenswrapper[4628]: I1211 05:46:56.986545 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/179d12ab-f93f-4ce5-a674-deed794d48f0-inventory-0\") pod \"179d12ab-f93f-4ce5-a674-deed794d48f0\" (UID: \"179d12ab-f93f-4ce5-a674-deed794d48f0\") " Dec 11 05:46:56 crc kubenswrapper[4628]: I1211 05:46:56.996631 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/179d12ab-f93f-4ce5-a674-deed794d48f0-kube-api-access-77t8n" (OuterVolumeSpecName: "kube-api-access-77t8n") pod "179d12ab-f93f-4ce5-a674-deed794d48f0" (UID: "179d12ab-f93f-4ce5-a674-deed794d48f0"). InnerVolumeSpecName "kube-api-access-77t8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.012521 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/179d12ab-f93f-4ce5-a674-deed794d48f0-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "179d12ab-f93f-4ce5-a674-deed794d48f0" (UID: "179d12ab-f93f-4ce5-a674-deed794d48f0"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.014613 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/179d12ab-f93f-4ce5-a674-deed794d48f0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "179d12ab-f93f-4ce5-a674-deed794d48f0" (UID: "179d12ab-f93f-4ce5-a674-deed794d48f0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.088471 4628 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/179d12ab-f93f-4ce5-a674-deed794d48f0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.088503 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77t8n\" (UniqueName: \"kubernetes.io/projected/179d12ab-f93f-4ce5-a674-deed794d48f0-kube-api-access-77t8n\") on node \"crc\" DevicePath \"\"" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.088514 4628 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/179d12ab-f93f-4ce5-a674-deed794d48f0-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.413090 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mfrfd" event={"ID":"179d12ab-f93f-4ce5-a674-deed794d48f0","Type":"ContainerDied","Data":"eb588aa6371681fe4bc52f8ef3fdcb9113b2342f853a9a016ffa470d06e0f1d6"} Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.413424 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb588aa6371681fe4bc52f8ef3fdcb9113b2342f853a9a016ffa470d06e0f1d6" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.413151 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mfrfd" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.492402 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj"] Dec 11 05:46:57 crc kubenswrapper[4628]: E1211 05:46:57.492860 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179d12ab-f93f-4ce5-a674-deed794d48f0" containerName="ssh-known-hosts-edpm-deployment" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.492884 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="179d12ab-f93f-4ce5-a674-deed794d48f0" containerName="ssh-known-hosts-edpm-deployment" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.493137 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="179d12ab-f93f-4ce5-a674-deed794d48f0" containerName="ssh-known-hosts-edpm-deployment" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.493906 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.504505 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t5hzf" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.504569 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.504648 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.504830 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.513561 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj"] Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.597917 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/381944d6-a058-41f8-a452-82d1933510e3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjbxj\" (UID: \"381944d6-a058-41f8-a452-82d1933510e3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.598008 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w5zw\" (UniqueName: \"kubernetes.io/projected/381944d6-a058-41f8-a452-82d1933510e3-kube-api-access-8w5zw\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjbxj\" (UID: \"381944d6-a058-41f8-a452-82d1933510e3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.598048 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/381944d6-a058-41f8-a452-82d1933510e3-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjbxj\" (UID: \"381944d6-a058-41f8-a452-82d1933510e3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.699914 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/381944d6-a058-41f8-a452-82d1933510e3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjbxj\" (UID: \"381944d6-a058-41f8-a452-82d1933510e3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.700000 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w5zw\" (UniqueName: \"kubernetes.io/projected/381944d6-a058-41f8-a452-82d1933510e3-kube-api-access-8w5zw\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjbxj\" (UID: \"381944d6-a058-41f8-a452-82d1933510e3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.700057 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/381944d6-a058-41f8-a452-82d1933510e3-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjbxj\" (UID: \"381944d6-a058-41f8-a452-82d1933510e3\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.710379 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/381944d6-a058-41f8-a452-82d1933510e3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjbxj\" (UID: \"381944d6-a058-41f8-a452-82d1933510e3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.710787 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/381944d6-a058-41f8-a452-82d1933510e3-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjbxj\" (UID: \"381944d6-a058-41f8-a452-82d1933510e3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.718603 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w5zw\" (UniqueName: \"kubernetes.io/projected/381944d6-a058-41f8-a452-82d1933510e3-kube-api-access-8w5zw\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qjbxj\" (UID: \"381944d6-a058-41f8-a452-82d1933510e3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj" Dec 11 05:46:57 crc kubenswrapper[4628]: I1211 05:46:57.818856 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj" Dec 11 05:46:58 crc kubenswrapper[4628]: I1211 05:46:58.362571 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj"] Dec 11 05:46:58 crc kubenswrapper[4628]: I1211 05:46:58.421591 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj" event={"ID":"381944d6-a058-41f8-a452-82d1933510e3","Type":"ContainerStarted","Data":"73d242e80526b4962cc8ed959705e702de2d07500698bd606a09d89f1ca2901e"} Dec 11 05:46:58 crc kubenswrapper[4628]: I1211 05:46:58.911155 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 05:46:59 crc kubenswrapper[4628]: I1211 05:46:59.433778 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj" event={"ID":"381944d6-a058-41f8-a452-82d1933510e3","Type":"ContainerStarted","Data":"8674252dea21e7bc49496a16953c8588b209b44f10dec2fb8f5ee76cf07a964d"} Dec 11 05:46:59 crc kubenswrapper[4628]: I1211 05:46:59.458272 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj" podStartSLOduration=1.92088519 podStartE2EDuration="2.458252276s" podCreationTimestamp="2025-12-11 05:46:57 +0000 UTC" firstStartedPulling="2025-12-11 05:46:58.368088899 +0000 UTC m=+1920.785435607" lastFinishedPulling="2025-12-11 05:46:58.905455995 +0000 UTC m=+1921.322802693" observedRunningTime="2025-12-11 05:46:59.449823819 +0000 UTC m=+1921.867170517" watchObservedRunningTime="2025-12-11 05:46:59.458252276 +0000 UTC m=+1921.875598974" Dec 11 05:47:08 crc kubenswrapper[4628]: I1211 05:47:08.538739 4628 generic.go:334] "Generic (PLEG): container finished" podID="381944d6-a058-41f8-a452-82d1933510e3" containerID="8674252dea21e7bc49496a16953c8588b209b44f10dec2fb8f5ee76cf07a964d" exitCode=0 Dec 11 05:47:08 crc kubenswrapper[4628]: I1211 05:47:08.538877 4628 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj" event={"ID":"381944d6-a058-41f8-a452-82d1933510e3","Type":"ContainerDied","Data":"8674252dea21e7bc49496a16953c8588b209b44f10dec2fb8f5ee76cf07a964d"} Dec 11 05:47:09 crc kubenswrapper[4628]: I1211 05:47:09.993055 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.075701 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/381944d6-a058-41f8-a452-82d1933510e3-ssh-key\") pod \"381944d6-a058-41f8-a452-82d1933510e3\" (UID: \"381944d6-a058-41f8-a452-82d1933510e3\") " Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.075753 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/381944d6-a058-41f8-a452-82d1933510e3-inventory\") pod \"381944d6-a058-41f8-a452-82d1933510e3\" (UID: \"381944d6-a058-41f8-a452-82d1933510e3\") " Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.075913 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w5zw\" (UniqueName: \"kubernetes.io/projected/381944d6-a058-41f8-a452-82d1933510e3-kube-api-access-8w5zw\") pod \"381944d6-a058-41f8-a452-82d1933510e3\" (UID: \"381944d6-a058-41f8-a452-82d1933510e3\") " Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.089011 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/381944d6-a058-41f8-a452-82d1933510e3-kube-api-access-8w5zw" (OuterVolumeSpecName: "kube-api-access-8w5zw") pod "381944d6-a058-41f8-a452-82d1933510e3" (UID: "381944d6-a058-41f8-a452-82d1933510e3"). InnerVolumeSpecName "kube-api-access-8w5zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.103956 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381944d6-a058-41f8-a452-82d1933510e3-inventory" (OuterVolumeSpecName: "inventory") pod "381944d6-a058-41f8-a452-82d1933510e3" (UID: "381944d6-a058-41f8-a452-82d1933510e3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.106832 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381944d6-a058-41f8-a452-82d1933510e3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "381944d6-a058-41f8-a452-82d1933510e3" (UID: "381944d6-a058-41f8-a452-82d1933510e3"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.178467 4628 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/381944d6-a058-41f8-a452-82d1933510e3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.178507 4628 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/381944d6-a058-41f8-a452-82d1933510e3-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.178519 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w5zw\" (UniqueName: \"kubernetes.io/projected/381944d6-a058-41f8-a452-82d1933510e3-kube-api-access-8w5zw\") on node \"crc\" DevicePath \"\"" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.556488 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj" event={"ID":"381944d6-a058-41f8-a452-82d1933510e3","Type":"ContainerDied","Data":"73d242e80526b4962cc8ed959705e702de2d07500698bd606a09d89f1ca2901e"} Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.556527 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73d242e80526b4962cc8ed959705e702de2d07500698bd606a09d89f1ca2901e" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.556538 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qjbxj" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.632834 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn"] Dec 11 05:47:10 crc kubenswrapper[4628]: E1211 05:47:10.633595 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="381944d6-a058-41f8-a452-82d1933510e3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.633695 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="381944d6-a058-41f8-a452-82d1933510e3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.634031 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="381944d6-a058-41f8-a452-82d1933510e3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.634838 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.638125 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.638209 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.638764 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t5hzf" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.643523 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn"] Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.652674 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.691970 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhbbr\" (UniqueName: \"kubernetes.io/projected/0e07dc05-985f-429b-8c55-221b86fb63be-kube-api-access-dhbbr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn\" (UID: \"0e07dc05-985f-429b-8c55-221b86fb63be\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.692427 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e07dc05-985f-429b-8c55-221b86fb63be-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn\" (UID: \"0e07dc05-985f-429b-8c55-221b86fb63be\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.692496 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e07dc05-985f-429b-8c55-221b86fb63be-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn\" (UID: \"0e07dc05-985f-429b-8c55-221b86fb63be\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.793884 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e07dc05-985f-429b-8c55-221b86fb63be-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn\" (UID: \"0e07dc05-985f-429b-8c55-221b86fb63be\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.793946 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e07dc05-985f-429b-8c55-221b86fb63be-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn\" (UID: \"0e07dc05-985f-429b-8c55-221b86fb63be\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.794003 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhbbr\" (UniqueName: \"kubernetes.io/projected/0e07dc05-985f-429b-8c55-221b86fb63be-kube-api-access-dhbbr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn\" (UID: 
\"0e07dc05-985f-429b-8c55-221b86fb63be\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.799770 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e07dc05-985f-429b-8c55-221b86fb63be-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn\" (UID: \"0e07dc05-985f-429b-8c55-221b86fb63be\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.810431 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e07dc05-985f-429b-8c55-221b86fb63be-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn\" (UID: \"0e07dc05-985f-429b-8c55-221b86fb63be\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.815115 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhbbr\" (UniqueName: \"kubernetes.io/projected/0e07dc05-985f-429b-8c55-221b86fb63be-kube-api-access-dhbbr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn\" (UID: \"0e07dc05-985f-429b-8c55-221b86fb63be\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn" Dec 11 05:47:10 crc kubenswrapper[4628]: I1211 05:47:10.978198 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn" Dec 11 05:47:11 crc kubenswrapper[4628]: I1211 05:47:11.517501 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn"] Dec 11 05:47:11 crc kubenswrapper[4628]: I1211 05:47:11.564971 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn" event={"ID":"0e07dc05-985f-429b-8c55-221b86fb63be","Type":"ContainerStarted","Data":"01c49ef7803cc41df530fb91874ab405e8638b68e3ac96170f8dcb41cedb6d5b"} Dec 11 05:47:13 crc kubenswrapper[4628]: I1211 05:47:13.583410 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn" event={"ID":"0e07dc05-985f-429b-8c55-221b86fb63be","Type":"ContainerStarted","Data":"9da7ddf5ff99f6ecb525c4e687f18e1bc5e3f15ce281e1cb30ddb56fa49b81bc"} Dec 11 05:47:13 crc kubenswrapper[4628]: I1211 05:47:13.614006 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn" podStartSLOduration=2.8356147529999998 podStartE2EDuration="3.613988757s" podCreationTimestamp="2025-12-11 05:47:10 +0000 UTC" firstStartedPulling="2025-12-11 05:47:11.530122338 +0000 UTC m=+1933.947469036" lastFinishedPulling="2025-12-11 05:47:12.308496332 +0000 UTC m=+1934.725843040" observedRunningTime="2025-12-11 05:47:13.607930125 +0000 UTC m=+1936.025276823" watchObservedRunningTime="2025-12-11 05:47:13.613988757 +0000 UTC m=+1936.031335455" Dec 11 05:47:22 crc kubenswrapper[4628]: I1211 05:47:22.667822 4628 generic.go:334] "Generic (PLEG): container finished" podID="0e07dc05-985f-429b-8c55-221b86fb63be" containerID="9da7ddf5ff99f6ecb525c4e687f18e1bc5e3f15ce281e1cb30ddb56fa49b81bc" exitCode=0 Dec 11 05:47:22 crc kubenswrapper[4628]: I1211 05:47:22.667923 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn" 
event={"ID":"0e07dc05-985f-429b-8c55-221b86fb63be","Type":"ContainerDied","Data":"9da7ddf5ff99f6ecb525c4e687f18e1bc5e3f15ce281e1cb30ddb56fa49b81bc"} Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.151745 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.321189 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhbbr\" (UniqueName: \"kubernetes.io/projected/0e07dc05-985f-429b-8c55-221b86fb63be-kube-api-access-dhbbr\") pod \"0e07dc05-985f-429b-8c55-221b86fb63be\" (UID: \"0e07dc05-985f-429b-8c55-221b86fb63be\") " Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.321256 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e07dc05-985f-429b-8c55-221b86fb63be-inventory\") pod \"0e07dc05-985f-429b-8c55-221b86fb63be\" (UID: \"0e07dc05-985f-429b-8c55-221b86fb63be\") " Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.321359 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e07dc05-985f-429b-8c55-221b86fb63be-ssh-key\") pod \"0e07dc05-985f-429b-8c55-221b86fb63be\" (UID: \"0e07dc05-985f-429b-8c55-221b86fb63be\") " Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.331486 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e07dc05-985f-429b-8c55-221b86fb63be-kube-api-access-dhbbr" (OuterVolumeSpecName: "kube-api-access-dhbbr") pod "0e07dc05-985f-429b-8c55-221b86fb63be" (UID: "0e07dc05-985f-429b-8c55-221b86fb63be"). InnerVolumeSpecName "kube-api-access-dhbbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.351647 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e07dc05-985f-429b-8c55-221b86fb63be-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0e07dc05-985f-429b-8c55-221b86fb63be" (UID: "0e07dc05-985f-429b-8c55-221b86fb63be"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.352223 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e07dc05-985f-429b-8c55-221b86fb63be-inventory" (OuterVolumeSpecName: "inventory") pod "0e07dc05-985f-429b-8c55-221b86fb63be" (UID: "0e07dc05-985f-429b-8c55-221b86fb63be"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.424043 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhbbr\" (UniqueName: \"kubernetes.io/projected/0e07dc05-985f-429b-8c55-221b86fb63be-kube-api-access-dhbbr\") on node \"crc\" DevicePath \"\"" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.424080 4628 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e07dc05-985f-429b-8c55-221b86fb63be-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.424091 4628 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e07dc05-985f-429b-8c55-221b86fb63be-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.690629 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn" event={"ID":"0e07dc05-985f-429b-8c55-221b86fb63be","Type":"ContainerDied","Data":"01c49ef7803cc41df530fb91874ab405e8638b68e3ac96170f8dcb41cedb6d5b"} Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.690699 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01c49ef7803cc41df530fb91874ab405e8638b68e3ac96170f8dcb41cedb6d5b" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.690766 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.782696 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v"] Dec 11 05:47:24 crc kubenswrapper[4628]: E1211 05:47:24.783332 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e07dc05-985f-429b-8c55-221b86fb63be" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.783354 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e07dc05-985f-429b-8c55-221b86fb63be" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.783581 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e07dc05-985f-429b-8c55-221b86fb63be" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.784275 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.787628 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.788932 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.789110 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.793326 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t5hzf" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.793732 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.794043 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.794550 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.794954 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.805551 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v"] Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.932610 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.932668 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.932720 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.932754 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.932983 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.933035 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.933066 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.933108 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.933181 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dfrv\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-kube-api-access-8dfrv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.933264 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.933387 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.933452 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.933512 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:24 crc kubenswrapper[4628]: I1211 05:47:24.933610 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.035450 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.035531 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.035569 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.035629 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.035666 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.035706 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.035732 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.035776 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.035809 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.035881 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.035903 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.035926 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.035967 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.036002 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dfrv\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-kube-api-access-8dfrv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.041087 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.041633 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.042433 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.043201 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.044564 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.045479 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.047228 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.047560 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.048736 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.049370 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.051252 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.052082 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.052497 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.057953 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dfrv\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-kube-api-access-8dfrv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.105578 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.641466 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v"] Dec 11 05:47:25 crc kubenswrapper[4628]: I1211 05:47:25.700041 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" event={"ID":"cbe8ae7a-0268-477b-a232-fb89a86e6c30","Type":"ContainerStarted","Data":"4db2d58638e2c59d83177ddab297b3239af8a88b0dc396d0be7041f2d523d4d6"} Dec 11 05:47:26 crc kubenswrapper[4628]: I1211 05:47:26.713866 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" event={"ID":"cbe8ae7a-0268-477b-a232-fb89a86e6c30","Type":"ContainerStarted","Data":"86b08c463c6ffd570354d88fbd5098e9393be6af1d9a366c4e80d85c83095a96"} Dec 11 05:47:26 crc kubenswrapper[4628]: I1211 05:47:26.749055 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" podStartSLOduration=2.303211521 podStartE2EDuration="2.749032229s" podCreationTimestamp="2025-12-11 05:47:24 +0000 UTC" firstStartedPulling="2025-12-11 05:47:25.645491523 +0000 UTC m=+1948.062838241" lastFinishedPulling="2025-12-11 05:47:26.091312091 +0000 UTC m=+1948.508658949" observedRunningTime="2025-12-11 05:47:26.734047325 +0000 UTC m=+1949.151394033" watchObservedRunningTime="2025-12-11 05:47:26.749032229 +0000 UTC m=+1949.166378947" Dec 11 05:48:01 crc kubenswrapper[4628]: I1211 05:48:01.427321 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:48:01 crc kubenswrapper[4628]: I1211 05:48:01.427997 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:48:09 crc kubenswrapper[4628]: I1211 05:48:09.129965 4628 generic.go:334] "Generic (PLEG): container finished" podID="cbe8ae7a-0268-477b-a232-fb89a86e6c30" containerID="86b08c463c6ffd570354d88fbd5098e9393be6af1d9a366c4e80d85c83095a96" exitCode=0 Dec 11 05:48:09 crc kubenswrapper[4628]: I1211 05:48:09.130169 4628 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" event={"ID":"cbe8ae7a-0268-477b-a232-fb89a86e6c30","Type":"ContainerDied","Data":"86b08c463c6ffd570354d88fbd5098e9393be6af1d9a366c4e80d85c83095a96"} Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.590222 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.679611 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-repo-setup-combined-ca-bundle\") pod \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.679701 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-nova-combined-ca-bundle\") pod \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.679751 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-bootstrap-combined-ca-bundle\") pod \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.679812 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dfrv\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-kube-api-access-8dfrv\") pod \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.679860 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-ovn-default-certs-0\") pod \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.679945 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.679969 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-ssh-key\") pod \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.680042 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " Dec 
11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.680069 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-inventory\") pod \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.680093 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-ovn-combined-ca-bundle\") pod \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.680128 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-neutron-metadata-combined-ca-bundle\") pod \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.680165 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.680203 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-libvirt-combined-ca-bundle\") pod \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.680226 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-telemetry-combined-ca-bundle\") pod \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\" (UID: \"cbe8ae7a-0268-477b-a232-fb89a86e6c30\") " Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.687282 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "cbe8ae7a-0268-477b-a232-fb89a86e6c30" (UID: "cbe8ae7a-0268-477b-a232-fb89a86e6c30"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.687369 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "cbe8ae7a-0268-477b-a232-fb89a86e6c30" (UID: "cbe8ae7a-0268-477b-a232-fb89a86e6c30"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.687534 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "cbe8ae7a-0268-477b-a232-fb89a86e6c30" (UID: "cbe8ae7a-0268-477b-a232-fb89a86e6c30"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.688223 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "cbe8ae7a-0268-477b-a232-fb89a86e6c30" (UID: "cbe8ae7a-0268-477b-a232-fb89a86e6c30"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.688232 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "cbe8ae7a-0268-477b-a232-fb89a86e6c30" (UID: "cbe8ae7a-0268-477b-a232-fb89a86e6c30"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.691836 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "cbe8ae7a-0268-477b-a232-fb89a86e6c30" (UID: "cbe8ae7a-0268-477b-a232-fb89a86e6c30"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.691865 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-kube-api-access-8dfrv" (OuterVolumeSpecName: "kube-api-access-8dfrv") pod "cbe8ae7a-0268-477b-a232-fb89a86e6c30" (UID: "cbe8ae7a-0268-477b-a232-fb89a86e6c30"). InnerVolumeSpecName "kube-api-access-8dfrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.694295 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "cbe8ae7a-0268-477b-a232-fb89a86e6c30" (UID: "cbe8ae7a-0268-477b-a232-fb89a86e6c30"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.694310 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "cbe8ae7a-0268-477b-a232-fb89a86e6c30" (UID: "cbe8ae7a-0268-477b-a232-fb89a86e6c30"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.694656 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "cbe8ae7a-0268-477b-a232-fb89a86e6c30" (UID: "cbe8ae7a-0268-477b-a232-fb89a86e6c30"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.696792 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "cbe8ae7a-0268-477b-a232-fb89a86e6c30" (UID: "cbe8ae7a-0268-477b-a232-fb89a86e6c30"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.705389 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "cbe8ae7a-0268-477b-a232-fb89a86e6c30" (UID: "cbe8ae7a-0268-477b-a232-fb89a86e6c30"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.715229 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-inventory" (OuterVolumeSpecName: "inventory") pod "cbe8ae7a-0268-477b-a232-fb89a86e6c30" (UID: "cbe8ae7a-0268-477b-a232-fb89a86e6c30"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.737757 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cbe8ae7a-0268-477b-a232-fb89a86e6c30" (UID: "cbe8ae7a-0268-477b-a232-fb89a86e6c30"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.782519 4628 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.782568 4628 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.782587 4628 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.782605 4628 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.782617 4628 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.782633 4628 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.782644 4628 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.782655 4628 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.782666 4628 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.782677 4628 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.782687 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dfrv\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-kube-api-access-8dfrv\") on node \"crc\" DevicePath \"\"" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.782698 4628 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.782708 4628 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbe8ae7a-0268-477b-a232-fb89a86e6c30-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:48:10 crc kubenswrapper[4628]: I1211 05:48:10.782720 4628 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbe8ae7a-0268-477b-a232-fb89a86e6c30-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.150467 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" event={"ID":"cbe8ae7a-0268-477b-a232-fb89a86e6c30","Type":"ContainerDied","Data":"4db2d58638e2c59d83177ddab297b3239af8a88b0dc396d0be7041f2d523d4d6"} Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.150515 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4db2d58638e2c59d83177ddab297b3239af8a88b0dc396d0be7041f2d523d4d6" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.150579 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.262549 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds"] Dec 11 05:48:11 crc kubenswrapper[4628]: E1211 05:48:11.265412 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe8ae7a-0268-477b-a232-fb89a86e6c30" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.265506 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe8ae7a-0268-477b-a232-fb89a86e6c30" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.265805 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbe8ae7a-0268-477b-a232-fb89a86e6c30" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.266668 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.272743 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.273991 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds"] Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.274128 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t5hzf" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.274161 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.274265 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.274269 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.397297 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab6a157-55db-4fda-8066-c9fee33d98b4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-drhds\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.397409 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ab6a157-55db-4fda-8066-c9fee33d98b4-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-drhds\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.397481 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ab6a157-55db-4fda-8066-c9fee33d98b4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-drhds\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.397521 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jhkm\" (UniqueName: \"kubernetes.io/projected/5ab6a157-55db-4fda-8066-c9fee33d98b4-kube-api-access-4jhkm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-drhds\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.397578 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5ab6a157-55db-4fda-8066-c9fee33d98b4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-drhds\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.498954 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5ab6a157-55db-4fda-8066-c9fee33d98b4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-drhds\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.499289 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jhkm\" (UniqueName: \"kubernetes.io/projected/5ab6a157-55db-4fda-8066-c9fee33d98b4-kube-api-access-4jhkm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-drhds\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.499483 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5ab6a157-55db-4fda-8066-c9fee33d98b4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-drhds\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.500162 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab6a157-55db-4fda-8066-c9fee33d98b4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-drhds\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.500238 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ab6a157-55db-4fda-8066-c9fee33d98b4-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-drhds\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.500526 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5ab6a157-55db-4fda-8066-c9fee33d98b4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-drhds\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.504376 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab6a157-55db-4fda-8066-c9fee33d98b4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-drhds\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.505441 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ab6a157-55db-4fda-8066-c9fee33d98b4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-drhds\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.512510 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ab6a157-55db-4fda-8066-c9fee33d98b4-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-drhds\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.522400 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jhkm\" (UniqueName: \"kubernetes.io/projected/5ab6a157-55db-4fda-8066-c9fee33d98b4-kube-api-access-4jhkm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-drhds\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:48:11 crc kubenswrapper[4628]: I1211 05:48:11.581125 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:48:12 crc kubenswrapper[4628]: I1211 05:48:12.114284 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds"] Dec 11 05:48:12 crc kubenswrapper[4628]: W1211 05:48:12.124710 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ab6a157_55db_4fda_8066_c9fee33d98b4.slice/crio-2d8cb283f48bf117b65de8fcca34b92fcb88632854ac3a869b94c72d091c549c WatchSource:0}: Error finding container 2d8cb283f48bf117b65de8fcca34b92fcb88632854ac3a869b94c72d091c549c: Status 404 returned error can't find the container with id 2d8cb283f48bf117b65de8fcca34b92fcb88632854ac3a869b94c72d091c549c Dec 11 05:48:12 crc kubenswrapper[4628]: I1211 05:48:12.163009 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" event={"ID":"5ab6a157-55db-4fda-8066-c9fee33d98b4","Type":"ContainerStarted","Data":"2d8cb283f48bf117b65de8fcca34b92fcb88632854ac3a869b94c72d091c549c"} Dec 11 05:48:13 crc kubenswrapper[4628]: I1211 05:48:13.171184 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" event={"ID":"5ab6a157-55db-4fda-8066-c9fee33d98b4","Type":"ContainerStarted","Data":"4ba81fe2509b2ebb401caf9def82df44ca8ba2ecee997a18563337330a8a4078"} Dec 11 05:48:31 crc kubenswrapper[4628]: I1211 05:48:31.426401 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:48:31 crc kubenswrapper[4628]: I1211 05:48:31.427011 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:49:01 crc kubenswrapper[4628]: I1211 05:49:01.427427 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:49:01 crc kubenswrapper[4628]: I1211 05:49:01.429869 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Dec 11 05:49:01 crc kubenswrapper[4628]: I1211 05:49:01.430090 4628 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:49:01 crc kubenswrapper[4628]: I1211 05:49:01.431288 4628 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9cede5e779a453211cb2fa7e39e6adea8ce5ae18e8668e6f7a6e7c92985fdcda"} pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 05:49:01 crc kubenswrapper[4628]: I1211 05:49:01.431539 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" containerID="cri-o://9cede5e779a453211cb2fa7e39e6adea8ce5ae18e8668e6f7a6e7c92985fdcda" gracePeriod=600 Dec 11 05:49:01 crc kubenswrapper[4628]: I1211 05:49:01.671384 4628 generic.go:334] "Generic (PLEG): container finished" podID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerID="9cede5e779a453211cb2fa7e39e6adea8ce5ae18e8668e6f7a6e7c92985fdcda" exitCode=0 Dec 11 05:49:01 crc kubenswrapper[4628]: I1211 05:49:01.671561 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerDied","Data":"9cede5e779a453211cb2fa7e39e6adea8ce5ae18e8668e6f7a6e7c92985fdcda"} Dec 11 05:49:01 crc kubenswrapper[4628]: I1211 05:49:01.671878 4628 scope.go:117] "RemoveContainer" containerID="d8f9a511c558fa6e7d9a05bb1fc365d3a5fe729a68755d6a032dacbf916b0c02" Dec 11 05:49:02 crc kubenswrapper[4628]: I1211 05:49:02.684135 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0"} Dec 11 05:49:02 crc kubenswrapper[4628]: I1211 05:49:02.700947 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" podStartSLOduration=51.068162756 podStartE2EDuration="51.700929541s" podCreationTimestamp="2025-12-11 05:48:11 +0000 UTC" firstStartedPulling="2025-12-11 05:48:12.127054685 +0000 UTC m=+1994.544401383" lastFinishedPulling="2025-12-11 05:48:12.75982143 +0000 UTC m=+1995.177168168" observedRunningTime="2025-12-11 05:48:13.190801287 +0000 UTC m=+1995.608147975" watchObservedRunningTime="2025-12-11 05:49:02.700929541 +0000 UTC m=+2045.118276249" Dec 11 05:49:27 crc kubenswrapper[4628]: I1211 05:49:27.886594 4628 generic.go:334] "Generic (PLEG): container finished" podID="5ab6a157-55db-4fda-8066-c9fee33d98b4" containerID="4ba81fe2509b2ebb401caf9def82df44ca8ba2ecee997a18563337330a8a4078" exitCode=0 Dec 11 05:49:27 crc kubenswrapper[4628]: I1211 05:49:27.887003 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" event={"ID":"5ab6a157-55db-4fda-8066-c9fee33d98b4","Type":"ContainerDied","Data":"4ba81fe2509b2ebb401caf9def82df44ca8ba2ecee997a18563337330a8a4078"} Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.478742 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.581059 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jhkm\" (UniqueName: \"kubernetes.io/projected/5ab6a157-55db-4fda-8066-c9fee33d98b4-kube-api-access-4jhkm\") pod \"5ab6a157-55db-4fda-8066-c9fee33d98b4\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.581458 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ab6a157-55db-4fda-8066-c9fee33d98b4-ssh-key\") pod \"5ab6a157-55db-4fda-8066-c9fee33d98b4\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.581518 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5ab6a157-55db-4fda-8066-c9fee33d98b4-ovncontroller-config-0\") pod \"5ab6a157-55db-4fda-8066-c9fee33d98b4\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.581569 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ab6a157-55db-4fda-8066-c9fee33d98b4-inventory\") pod \"5ab6a157-55db-4fda-8066-c9fee33d98b4\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.581590 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab6a157-55db-4fda-8066-c9fee33d98b4-ovn-combined-ca-bundle\") pod \"5ab6a157-55db-4fda-8066-c9fee33d98b4\" (UID: \"5ab6a157-55db-4fda-8066-c9fee33d98b4\") " Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.586759 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ab6a157-55db-4fda-8066-c9fee33d98b4-kube-api-access-4jhkm" (OuterVolumeSpecName: "kube-api-access-4jhkm") pod "5ab6a157-55db-4fda-8066-c9fee33d98b4" (UID: "5ab6a157-55db-4fda-8066-c9fee33d98b4"). InnerVolumeSpecName "kube-api-access-4jhkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.591044 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab6a157-55db-4fda-8066-c9fee33d98b4-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "5ab6a157-55db-4fda-8066-c9fee33d98b4" (UID: "5ab6a157-55db-4fda-8066-c9fee33d98b4"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.616209 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ab6a157-55db-4fda-8066-c9fee33d98b4-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "5ab6a157-55db-4fda-8066-c9fee33d98b4" (UID: "5ab6a157-55db-4fda-8066-c9fee33d98b4"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.616573 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab6a157-55db-4fda-8066-c9fee33d98b4-inventory" (OuterVolumeSpecName: "inventory") pod "5ab6a157-55db-4fda-8066-c9fee33d98b4" (UID: "5ab6a157-55db-4fda-8066-c9fee33d98b4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.618391 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab6a157-55db-4fda-8066-c9fee33d98b4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5ab6a157-55db-4fda-8066-c9fee33d98b4" (UID: "5ab6a157-55db-4fda-8066-c9fee33d98b4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.683667 4628 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab6a157-55db-4fda-8066-c9fee33d98b4-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.683867 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jhkm\" (UniqueName: \"kubernetes.io/projected/5ab6a157-55db-4fda-8066-c9fee33d98b4-kube-api-access-4jhkm\") on node \"crc\" DevicePath \"\"" Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.683930 4628 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ab6a157-55db-4fda-8066-c9fee33d98b4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.684029 4628 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5ab6a157-55db-4fda-8066-c9fee33d98b4-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.684083 4628 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ab6a157-55db-4fda-8066-c9fee33d98b4-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.903329 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" event={"ID":"5ab6a157-55db-4fda-8066-c9fee33d98b4","Type":"ContainerDied","Data":"2d8cb283f48bf117b65de8fcca34b92fcb88632854ac3a869b94c72d091c549c"} Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.903368 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d8cb283f48bf117b65de8fcca34b92fcb88632854ac3a869b94c72d091c549c" Dec 11 05:49:29 crc kubenswrapper[4628]: I1211 05:49:29.903421 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-drhds" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.051701 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7"] Dec 11 05:49:30 crc kubenswrapper[4628]: E1211 05:49:30.052188 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab6a157-55db-4fda-8066-c9fee33d98b4" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.052211 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab6a157-55db-4fda-8066-c9fee33d98b4" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.052456 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ab6a157-55db-4fda-8066-c9fee33d98b4" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.053216 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.056114 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.058699 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.060194 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.060419 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.060201 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t5hzf" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.062517 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.066655 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7"] Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.193395 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gvwn\" (UniqueName: \"kubernetes.io/projected/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-kube-api-access-9gvwn\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.193466 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.193520 4628 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.193786 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.193894 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.193934 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.295875 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.295992 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.296034 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.296058 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.296109 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gvwn\" (UniqueName: \"kubernetes.io/projected/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-kube-api-access-9gvwn\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.296145 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.302015 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.303933 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.303941 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.305553 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.312197 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.320307 4628 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-9gvwn\" (UniqueName: \"kubernetes.io/projected/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-kube-api-access-9gvwn\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.380408 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:49:30 crc kubenswrapper[4628]: I1211 05:49:30.952987 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7"] Dec 11 05:49:31 crc kubenswrapper[4628]: I1211 05:49:31.922820 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" event={"ID":"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6","Type":"ContainerStarted","Data":"ae21ab47e40e103d812d39caeb1730241c9a8dd7a3a5182cc137dbf278c11309"} Dec 11 05:49:31 crc kubenswrapper[4628]: I1211 05:49:31.923189 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" event={"ID":"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6","Type":"ContainerStarted","Data":"c5735663f41d62f3a3de31c23360f9502468e0c58712ca0b56e7605d6c24c14d"} Dec 11 05:49:31 crc kubenswrapper[4628]: I1211 05:49:31.945373 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" podStartSLOduration=1.512217111 podStartE2EDuration="1.945354924s" podCreationTimestamp="2025-12-11 05:49:30 +0000 UTC" firstStartedPulling="2025-12-11 05:49:30.967600641 +0000 UTC m=+2073.384947339" lastFinishedPulling="2025-12-11 05:49:31.400738444 +0000 UTC m=+2073.818085152" observedRunningTime="2025-12-11 05:49:31.943501424 +0000 UTC m=+2074.360848132" watchObservedRunningTime="2025-12-11 05:49:31.945354924 +0000 UTC m=+2074.362701632" Dec 11 05:50:06 crc kubenswrapper[4628]: I1211 05:50:06.689775 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fj5vn"] Dec 11 05:50:06 crc kubenswrapper[4628]: I1211 05:50:06.692997 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fj5vn" Dec 11 05:50:06 crc kubenswrapper[4628]: I1211 05:50:06.702593 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj5vn"] Dec 11 05:50:06 crc kubenswrapper[4628]: I1211 05:50:06.727986 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96bd8e9b-a8ae-4f12-a213-0668a57d89be-utilities\") pod \"redhat-marketplace-fj5vn\" (UID: \"96bd8e9b-a8ae-4f12-a213-0668a57d89be\") " pod="openshift-marketplace/redhat-marketplace-fj5vn" Dec 11 05:50:06 crc kubenswrapper[4628]: I1211 05:50:06.728068 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96bd8e9b-a8ae-4f12-a213-0668a57d89be-catalog-content\") pod \"redhat-marketplace-fj5vn\" (UID: \"96bd8e9b-a8ae-4f12-a213-0668a57d89be\") " pod="openshift-marketplace/redhat-marketplace-fj5vn" Dec 11 05:50:06 crc kubenswrapper[4628]: I1211 05:50:06.728360 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h42br\" (UniqueName: \"kubernetes.io/projected/96bd8e9b-a8ae-4f12-a213-0668a57d89be-kube-api-access-h42br\") pod \"redhat-marketplace-fj5vn\" (UID: \"96bd8e9b-a8ae-4f12-a213-0668a57d89be\") " pod="openshift-marketplace/redhat-marketplace-fj5vn" Dec 11 05:50:06 crc kubenswrapper[4628]: I1211 05:50:06.829783 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h42br\" (UniqueName: \"kubernetes.io/projected/96bd8e9b-a8ae-4f12-a213-0668a57d89be-kube-api-access-h42br\") pod \"redhat-marketplace-fj5vn\" (UID: \"96bd8e9b-a8ae-4f12-a213-0668a57d89be\") " pod="openshift-marketplace/redhat-marketplace-fj5vn" Dec 11 05:50:06 crc kubenswrapper[4628]: I1211 05:50:06.829875 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96bd8e9b-a8ae-4f12-a213-0668a57d89be-utilities\") pod \"redhat-marketplace-fj5vn\" (UID: \"96bd8e9b-a8ae-4f12-a213-0668a57d89be\") " pod="openshift-marketplace/redhat-marketplace-fj5vn" Dec 11 05:50:06 crc kubenswrapper[4628]: I1211 05:50:06.829898 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96bd8e9b-a8ae-4f12-a213-0668a57d89be-catalog-content\") pod \"redhat-marketplace-fj5vn\" (UID: \"96bd8e9b-a8ae-4f12-a213-0668a57d89be\") " pod="openshift-marketplace/redhat-marketplace-fj5vn" Dec 11 05:50:06 crc kubenswrapper[4628]: I1211 05:50:06.830449 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96bd8e9b-a8ae-4f12-a213-0668a57d89be-catalog-content\") pod \"redhat-marketplace-fj5vn\" (UID: \"96bd8e9b-a8ae-4f12-a213-0668a57d89be\") " pod="openshift-marketplace/redhat-marketplace-fj5vn" Dec 11 05:50:06 crc kubenswrapper[4628]: I1211 05:50:06.830585 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96bd8e9b-a8ae-4f12-a213-0668a57d89be-utilities\") pod \"redhat-marketplace-fj5vn\" (UID: \"96bd8e9b-a8ae-4f12-a213-0668a57d89be\") " pod="openshift-marketplace/redhat-marketplace-fj5vn" Dec 11 05:50:06 crc kubenswrapper[4628]: I1211 05:50:06.850965 4628 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-h42br\" (UniqueName: \"kubernetes.io/projected/96bd8e9b-a8ae-4f12-a213-0668a57d89be-kube-api-access-h42br\") pod \"redhat-marketplace-fj5vn\" (UID: \"96bd8e9b-a8ae-4f12-a213-0668a57d89be\") " pod="openshift-marketplace/redhat-marketplace-fj5vn" Dec 11 05:50:07 crc kubenswrapper[4628]: I1211 05:50:07.016298 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fj5vn" Dec 11 05:50:07 crc kubenswrapper[4628]: I1211 05:50:07.535933 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj5vn"] Dec 11 05:50:08 crc kubenswrapper[4628]: I1211 05:50:08.245185 4628 generic.go:334] "Generic (PLEG): container finished" podID="96bd8e9b-a8ae-4f12-a213-0668a57d89be" containerID="8a285caddb3f193247b0ccad7081a38c3573a5e4f70d2f10bb507d408d42e666" exitCode=0 Dec 11 05:50:08 crc kubenswrapper[4628]: I1211 05:50:08.245485 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj5vn" event={"ID":"96bd8e9b-a8ae-4f12-a213-0668a57d89be","Type":"ContainerDied","Data":"8a285caddb3f193247b0ccad7081a38c3573a5e4f70d2f10bb507d408d42e666"} Dec 11 05:50:08 crc kubenswrapper[4628]: I1211 05:50:08.245511 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj5vn" event={"ID":"96bd8e9b-a8ae-4f12-a213-0668a57d89be","Type":"ContainerStarted","Data":"3d571e67aedc3d8a2da4d24ec6d001f89768e8263a712e40b56f22c01f9a8896"} Dec 11 05:50:10 crc kubenswrapper[4628]: I1211 05:50:10.264765 4628 generic.go:334] "Generic (PLEG): container finished" podID="96bd8e9b-a8ae-4f12-a213-0668a57d89be" containerID="c3d613850d899ce279ffd191187f01d0a3d0599d1bda0299fa1af76208a08238" exitCode=0 Dec 11 05:50:10 crc kubenswrapper[4628]: I1211 05:50:10.264825 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj5vn" event={"ID":"96bd8e9b-a8ae-4f12-a213-0668a57d89be","Type":"ContainerDied","Data":"c3d613850d899ce279ffd191187f01d0a3d0599d1bda0299fa1af76208a08238"} Dec 11 05:50:12 crc kubenswrapper[4628]: I1211 05:50:12.284262 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj5vn" event={"ID":"96bd8e9b-a8ae-4f12-a213-0668a57d89be","Type":"ContainerStarted","Data":"cc2e82e82a081cef27e9694baf34c8a523a9f3bac073613fa487ea34153741b0"} Dec 11 05:50:12 crc kubenswrapper[4628]: I1211 05:50:12.303283 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fj5vn" podStartSLOduration=3.135313868 podStartE2EDuration="6.303263182s" podCreationTimestamp="2025-12-11 05:50:06 +0000 UTC" firstStartedPulling="2025-12-11 05:50:08.247559081 +0000 UTC m=+2110.664905779" lastFinishedPulling="2025-12-11 05:50:11.415508385 +0000 UTC m=+2113.832855093" observedRunningTime="2025-12-11 05:50:12.300498238 +0000 UTC m=+2114.717844946" watchObservedRunningTime="2025-12-11 05:50:12.303263182 +0000 UTC m=+2114.720609880" Dec 11 05:50:17 crc kubenswrapper[4628]: I1211 05:50:17.017038 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fj5vn" Dec 11 05:50:17 crc kubenswrapper[4628]: I1211 05:50:17.017717 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fj5vn" Dec 11 05:50:17 crc kubenswrapper[4628]: I1211 05:50:17.083493 4628 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fj5vn" Dec 11 05:50:17 crc kubenswrapper[4628]: I1211 05:50:17.411586 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fj5vn" Dec 11 05:50:17 crc kubenswrapper[4628]: I1211 05:50:17.477210 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj5vn"] Dec 11 05:50:19 crc kubenswrapper[4628]: I1211 05:50:19.346242 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fj5vn" podUID="96bd8e9b-a8ae-4f12-a213-0668a57d89be" containerName="registry-server" containerID="cri-o://cc2e82e82a081cef27e9694baf34c8a523a9f3bac073613fa487ea34153741b0" gracePeriod=2 Dec 11 05:50:19 crc kubenswrapper[4628]: I1211 05:50:19.833078 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fj5vn" Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.007544 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96bd8e9b-a8ae-4f12-a213-0668a57d89be-catalog-content\") pod \"96bd8e9b-a8ae-4f12-a213-0668a57d89be\" (UID: \"96bd8e9b-a8ae-4f12-a213-0668a57d89be\") " Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.007619 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h42br\" (UniqueName: \"kubernetes.io/projected/96bd8e9b-a8ae-4f12-a213-0668a57d89be-kube-api-access-h42br\") pod \"96bd8e9b-a8ae-4f12-a213-0668a57d89be\" (UID: \"96bd8e9b-a8ae-4f12-a213-0668a57d89be\") " Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.007923 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96bd8e9b-a8ae-4f12-a213-0668a57d89be-utilities\") pod \"96bd8e9b-a8ae-4f12-a213-0668a57d89be\" (UID: \"96bd8e9b-a8ae-4f12-a213-0668a57d89be\") " Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.009963 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96bd8e9b-a8ae-4f12-a213-0668a57d89be-utilities" (OuterVolumeSpecName: "utilities") pod "96bd8e9b-a8ae-4f12-a213-0668a57d89be" (UID: "96bd8e9b-a8ae-4f12-a213-0668a57d89be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.024057 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96bd8e9b-a8ae-4f12-a213-0668a57d89be-kube-api-access-h42br" (OuterVolumeSpecName: "kube-api-access-h42br") pod "96bd8e9b-a8ae-4f12-a213-0668a57d89be" (UID: "96bd8e9b-a8ae-4f12-a213-0668a57d89be"). InnerVolumeSpecName "kube-api-access-h42br". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.045665 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96bd8e9b-a8ae-4f12-a213-0668a57d89be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96bd8e9b-a8ae-4f12-a213-0668a57d89be" (UID: "96bd8e9b-a8ae-4f12-a213-0668a57d89be"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.110280 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96bd8e9b-a8ae-4f12-a213-0668a57d89be-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.110318 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96bd8e9b-a8ae-4f12-a213-0668a57d89be-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.110331 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h42br\" (UniqueName: \"kubernetes.io/projected/96bd8e9b-a8ae-4f12-a213-0668a57d89be-kube-api-access-h42br\") on node \"crc\" DevicePath \"\"" Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.364150 4628 generic.go:334] "Generic (PLEG): container finished" podID="96bd8e9b-a8ae-4f12-a213-0668a57d89be" containerID="cc2e82e82a081cef27e9694baf34c8a523a9f3bac073613fa487ea34153741b0" exitCode=0 Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.364194 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj5vn" event={"ID":"96bd8e9b-a8ae-4f12-a213-0668a57d89be","Type":"ContainerDied","Data":"cc2e82e82a081cef27e9694baf34c8a523a9f3bac073613fa487ea34153741b0"} Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.364509 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fj5vn" event={"ID":"96bd8e9b-a8ae-4f12-a213-0668a57d89be","Type":"ContainerDied","Data":"3d571e67aedc3d8a2da4d24ec6d001f89768e8263a712e40b56f22c01f9a8896"} Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.364532 4628 scope.go:117] "RemoveContainer" containerID="cc2e82e82a081cef27e9694baf34c8a523a9f3bac073613fa487ea34153741b0" Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.364264 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fj5vn" Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.391281 4628 scope.go:117] "RemoveContainer" containerID="c3d613850d899ce279ffd191187f01d0a3d0599d1bda0299fa1af76208a08238" Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.405489 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj5vn"] Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.415477 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fj5vn"] Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.421049 4628 scope.go:117] "RemoveContainer" containerID="8a285caddb3f193247b0ccad7081a38c3573a5e4f70d2f10bb507d408d42e666" Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.458556 4628 scope.go:117] "RemoveContainer" containerID="cc2e82e82a081cef27e9694baf34c8a523a9f3bac073613fa487ea34153741b0" Dec 11 05:50:20 crc kubenswrapper[4628]: E1211 05:50:20.458902 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc2e82e82a081cef27e9694baf34c8a523a9f3bac073613fa487ea34153741b0\": container with ID starting with cc2e82e82a081cef27e9694baf34c8a523a9f3bac073613fa487ea34153741b0 not found: ID does not exist" containerID="cc2e82e82a081cef27e9694baf34c8a523a9f3bac073613fa487ea34153741b0" Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.458938 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc2e82e82a081cef27e9694baf34c8a523a9f3bac073613fa487ea34153741b0"} err="failed to get container status \"cc2e82e82a081cef27e9694baf34c8a523a9f3bac073613fa487ea34153741b0\": rpc error: code = NotFound desc = could not find container \"cc2e82e82a081cef27e9694baf34c8a523a9f3bac073613fa487ea34153741b0\": container with ID starting with cc2e82e82a081cef27e9694baf34c8a523a9f3bac073613fa487ea34153741b0 not found: ID does not exist" Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.458962 4628 scope.go:117] "RemoveContainer" containerID="c3d613850d899ce279ffd191187f01d0a3d0599d1bda0299fa1af76208a08238" Dec 11 05:50:20 crc kubenswrapper[4628]: E1211 05:50:20.459188 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3d613850d899ce279ffd191187f01d0a3d0599d1bda0299fa1af76208a08238\": container with ID starting with c3d613850d899ce279ffd191187f01d0a3d0599d1bda0299fa1af76208a08238 not found: ID does not exist" containerID="c3d613850d899ce279ffd191187f01d0a3d0599d1bda0299fa1af76208a08238" Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.459213 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d613850d899ce279ffd191187f01d0a3d0599d1bda0299fa1af76208a08238"} err="failed to get container status \"c3d613850d899ce279ffd191187f01d0a3d0599d1bda0299fa1af76208a08238\": rpc error: code = NotFound desc = could not find container \"c3d613850d899ce279ffd191187f01d0a3d0599d1bda0299fa1af76208a08238\": container with ID starting with c3d613850d899ce279ffd191187f01d0a3d0599d1bda0299fa1af76208a08238 not found: ID does not exist" Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.459232 4628 scope.go:117] "RemoveContainer" containerID="8a285caddb3f193247b0ccad7081a38c3573a5e4f70d2f10bb507d408d42e666" Dec 11 05:50:20 crc kubenswrapper[4628]: E1211 05:50:20.459449 4628 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8a285caddb3f193247b0ccad7081a38c3573a5e4f70d2f10bb507d408d42e666\": container with ID starting with 8a285caddb3f193247b0ccad7081a38c3573a5e4f70d2f10bb507d408d42e666 not found: ID does not exist" containerID="8a285caddb3f193247b0ccad7081a38c3573a5e4f70d2f10bb507d408d42e666" Dec 11 05:50:20 crc kubenswrapper[4628]: I1211 05:50:20.459476 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a285caddb3f193247b0ccad7081a38c3573a5e4f70d2f10bb507d408d42e666"} err="failed to get container status \"8a285caddb3f193247b0ccad7081a38c3573a5e4f70d2f10bb507d408d42e666\": rpc error: code = NotFound desc = could not find container \"8a285caddb3f193247b0ccad7081a38c3573a5e4f70d2f10bb507d408d42e666\": container with ID starting with 8a285caddb3f193247b0ccad7081a38c3573a5e4f70d2f10bb507d408d42e666 not found: ID does not exist" Dec 11 05:50:21 crc kubenswrapper[4628]: I1211 05:50:21.902337 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96bd8e9b-a8ae-4f12-a213-0668a57d89be" path="/var/lib/kubelet/pods/96bd8e9b-a8ae-4f12-a213-0668a57d89be/volumes" Dec 11 05:50:26 crc kubenswrapper[4628]: I1211 05:50:26.431677 4628 generic.go:334] "Generic (PLEG): container finished" podID="c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6" containerID="ae21ab47e40e103d812d39caeb1730241c9a8dd7a3a5182cc137dbf278c11309" exitCode=0 Dec 11 05:50:26 crc kubenswrapper[4628]: I1211 05:50:26.431825 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" event={"ID":"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6","Type":"ContainerDied","Data":"ae21ab47e40e103d812d39caeb1730241c9a8dd7a3a5182cc137dbf278c11309"} Dec 11 05:50:27 crc kubenswrapper[4628]: I1211 05:50:27.881723 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:50:27 crc kubenswrapper[4628]: I1211 05:50:27.961777 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-ssh-key\") pod \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " Dec 11 05:50:27 crc kubenswrapper[4628]: I1211 05:50:27.962882 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gvwn\" (UniqueName: \"kubernetes.io/projected/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-kube-api-access-9gvwn\") pod \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " Dec 11 05:50:27 crc kubenswrapper[4628]: I1211 05:50:27.962931 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-inventory\") pod \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " Dec 11 05:50:27 crc kubenswrapper[4628]: I1211 05:50:27.962957 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " Dec 11 05:50:27 crc kubenswrapper[4628]: I1211 05:50:27.962998 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-neutron-metadata-combined-ca-bundle\") pod \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " Dec 11 05:50:27 crc kubenswrapper[4628]: I1211 05:50:27.963221 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-nova-metadata-neutron-config-0\") pod \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\" (UID: \"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6\") " Dec 11 05:50:27 crc kubenswrapper[4628]: I1211 05:50:27.968705 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6" (UID: "c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:50:27 crc kubenswrapper[4628]: I1211 05:50:27.981243 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-kube-api-access-9gvwn" (OuterVolumeSpecName: "kube-api-access-9gvwn") pod "c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6" (UID: "c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6"). InnerVolumeSpecName "kube-api-access-9gvwn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:50:27 crc kubenswrapper[4628]: I1211 05:50:27.996491 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-inventory" (OuterVolumeSpecName: "inventory") pod "c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6" (UID: "c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:50:27 crc kubenswrapper[4628]: I1211 05:50:27.996997 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6" (UID: "c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:50:27 crc kubenswrapper[4628]: I1211 05:50:27.998948 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6" (UID: "c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.009268 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6" (UID: "c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.065152 4628 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.065180 4628 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.065190 4628 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.065199 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gvwn\" (UniqueName: \"kubernetes.io/projected/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-kube-api-access-9gvwn\") on node \"crc\" DevicePath \"\"" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.065209 4628 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.065218 4628 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.452186 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" event={"ID":"c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6","Type":"ContainerDied","Data":"c5735663f41d62f3a3de31c23360f9502468e0c58712ca0b56e7605d6c24c14d"} Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.452460 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5735663f41d62f3a3de31c23360f9502468e0c58712ca0b56e7605d6c24c14d" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.452232 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.652318 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw"] Dec 11 05:50:28 crc kubenswrapper[4628]: E1211 05:50:28.653062 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96bd8e9b-a8ae-4f12-a213-0668a57d89be" containerName="extract-content" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.653088 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="96bd8e9b-a8ae-4f12-a213-0668a57d89be" containerName="extract-content" Dec 11 05:50:28 crc kubenswrapper[4628]: E1211 05:50:28.653104 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96bd8e9b-a8ae-4f12-a213-0668a57d89be" containerName="extract-utilities" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.653115 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="96bd8e9b-a8ae-4f12-a213-0668a57d89be" containerName="extract-utilities" Dec 11 05:50:28 crc kubenswrapper[4628]: E1211 05:50:28.653132 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96bd8e9b-a8ae-4f12-a213-0668a57d89be" containerName="registry-server" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.653142 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="96bd8e9b-a8ae-4f12-a213-0668a57d89be" containerName="registry-server" Dec 11 05:50:28 crc kubenswrapper[4628]: E1211 05:50:28.653197 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.653207 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.653426 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="96bd8e9b-a8ae-4f12-a213-0668a57d89be" containerName="registry-server" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.653441 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.654203 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.656203 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.656931 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.657464 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.657629 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.658522 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t5hzf" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.673565 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw"] Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.674634 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.674708 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.674745 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.674895 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.674969 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-745dq\" (UniqueName: \"kubernetes.io/projected/10745043-8954-4864-9b9b-d3b2e8614e36-kube-api-access-745dq\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.777424 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.777764 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.777871 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.778011 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.778112 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-745dq\" (UniqueName: \"kubernetes.io/projected/10745043-8954-4864-9b9b-d3b2e8614e36-kube-api-access-745dq\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.782193 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.782260 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.782637 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.785438 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-libvirt-combined-ca-bundle\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.800117 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-745dq\" (UniqueName: \"kubernetes.io/projected/10745043-8954-4864-9b9b-d3b2e8614e36-kube-api-access-745dq\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:50:28 crc kubenswrapper[4628]: I1211 05:50:28.974300 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:50:29 crc kubenswrapper[4628]: I1211 05:50:29.546560 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw"] Dec 11 05:50:30 crc kubenswrapper[4628]: I1211 05:50:30.471709 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" event={"ID":"10745043-8954-4864-9b9b-d3b2e8614e36","Type":"ContainerStarted","Data":"1345efc2980ca452751ee792821e80ad94643886b43b6d9eea6ea81db8fed4a4"} Dec 11 05:50:30 crc kubenswrapper[4628]: I1211 05:50:30.472358 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" event={"ID":"10745043-8954-4864-9b9b-d3b2e8614e36","Type":"ContainerStarted","Data":"ab9dec114d60df6b3d1f784a19b2ecc0f4f8992b03f098995d9d06ddf5a025d6"} Dec 11 05:50:30 crc kubenswrapper[4628]: I1211 05:50:30.495913 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" podStartSLOduration=1.92959511 podStartE2EDuration="2.495889284s" podCreationTimestamp="2025-12-11 05:50:28 +0000 UTC" firstStartedPulling="2025-12-11 05:50:29.554804269 +0000 UTC m=+2131.972150967" lastFinishedPulling="2025-12-11 05:50:30.121098443 +0000 UTC m=+2132.538445141" observedRunningTime="2025-12-11 05:50:30.485227827 +0000 UTC m=+2132.902574525" watchObservedRunningTime="2025-12-11 05:50:30.495889284 +0000 UTC m=+2132.913236002" Dec 11 05:51:01 crc kubenswrapper[4628]: I1211 05:51:01.427153 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:51:01 crc kubenswrapper[4628]: I1211 05:51:01.427748 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:51:16 crc kubenswrapper[4628]: I1211 05:51:16.838233 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jrs74"] Dec 11 05:51:16 crc kubenswrapper[4628]: I1211 05:51:16.851157 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jrs74"] Dec 11 05:51:16 crc kubenswrapper[4628]: I1211 05:51:16.851283 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jrs74" Dec 11 05:51:16 crc kubenswrapper[4628]: I1211 05:51:16.934193 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dda5112-2ea5-46b4-a3d6-af264ea070b5-utilities\") pod \"certified-operators-jrs74\" (UID: \"1dda5112-2ea5-46b4-a3d6-af264ea070b5\") " pod="openshift-marketplace/certified-operators-jrs74" Dec 11 05:51:16 crc kubenswrapper[4628]: I1211 05:51:16.934257 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78ggt\" (UniqueName: \"kubernetes.io/projected/1dda5112-2ea5-46b4-a3d6-af264ea070b5-kube-api-access-78ggt\") pod \"certified-operators-jrs74\" (UID: \"1dda5112-2ea5-46b4-a3d6-af264ea070b5\") " pod="openshift-marketplace/certified-operators-jrs74" Dec 11 05:51:16 crc kubenswrapper[4628]: I1211 05:51:16.934366 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dda5112-2ea5-46b4-a3d6-af264ea070b5-catalog-content\") pod \"certified-operators-jrs74\" (UID: \"1dda5112-2ea5-46b4-a3d6-af264ea070b5\") " pod="openshift-marketplace/certified-operators-jrs74" Dec 11 05:51:17 crc kubenswrapper[4628]: I1211 05:51:17.035926 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dda5112-2ea5-46b4-a3d6-af264ea070b5-utilities\") pod \"certified-operators-jrs74\" (UID: \"1dda5112-2ea5-46b4-a3d6-af264ea070b5\") " pod="openshift-marketplace/certified-operators-jrs74" Dec 11 05:51:17 crc kubenswrapper[4628]: I1211 05:51:17.035997 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78ggt\" (UniqueName: \"kubernetes.io/projected/1dda5112-2ea5-46b4-a3d6-af264ea070b5-kube-api-access-78ggt\") pod \"certified-operators-jrs74\" (UID: \"1dda5112-2ea5-46b4-a3d6-af264ea070b5\") " pod="openshift-marketplace/certified-operators-jrs74" Dec 11 05:51:17 crc kubenswrapper[4628]: I1211 05:51:17.036126 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dda5112-2ea5-46b4-a3d6-af264ea070b5-catalog-content\") pod \"certified-operators-jrs74\" (UID: \"1dda5112-2ea5-46b4-a3d6-af264ea070b5\") " pod="openshift-marketplace/certified-operators-jrs74" Dec 11 05:51:17 crc kubenswrapper[4628]: I1211 05:51:17.036581 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dda5112-2ea5-46b4-a3d6-af264ea070b5-utilities\") pod \"certified-operators-jrs74\" (UID: \"1dda5112-2ea5-46b4-a3d6-af264ea070b5\") " pod="openshift-marketplace/certified-operators-jrs74" Dec 11 05:51:17 crc kubenswrapper[4628]: I1211 05:51:17.036923 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dda5112-2ea5-46b4-a3d6-af264ea070b5-catalog-content\") pod \"certified-operators-jrs74\" (UID: \"1dda5112-2ea5-46b4-a3d6-af264ea070b5\") " pod="openshift-marketplace/certified-operators-jrs74" Dec 11 05:51:17 crc kubenswrapper[4628]: I1211 05:51:17.066085 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78ggt\" (UniqueName: \"kubernetes.io/projected/1dda5112-2ea5-46b4-a3d6-af264ea070b5-kube-api-access-78ggt\") pod 
\"certified-operators-jrs74\" (UID: \"1dda5112-2ea5-46b4-a3d6-af264ea070b5\") " pod="openshift-marketplace/certified-operators-jrs74" Dec 11 05:51:17 crc kubenswrapper[4628]: I1211 05:51:17.169772 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jrs74" Dec 11 05:51:17 crc kubenswrapper[4628]: I1211 05:51:17.718653 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jrs74"] Dec 11 05:51:17 crc kubenswrapper[4628]: I1211 05:51:17.943424 4628 generic.go:334] "Generic (PLEG): container finished" podID="1dda5112-2ea5-46b4-a3d6-af264ea070b5" containerID="8950cb6284257e2c9b44fbf6635be68da007b77402e50502cc6c649d38203cd7" exitCode=0 Dec 11 05:51:17 crc kubenswrapper[4628]: I1211 05:51:17.943808 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrs74" event={"ID":"1dda5112-2ea5-46b4-a3d6-af264ea070b5","Type":"ContainerDied","Data":"8950cb6284257e2c9b44fbf6635be68da007b77402e50502cc6c649d38203cd7"} Dec 11 05:51:17 crc kubenswrapper[4628]: I1211 05:51:17.943894 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrs74" event={"ID":"1dda5112-2ea5-46b4-a3d6-af264ea070b5","Type":"ContainerStarted","Data":"d0bf0ad83fd7c3151c93ca5d4a4d141f3568cb9ef2e8167c6106f5b52a83d0b7"} Dec 11 05:51:17 crc kubenswrapper[4628]: I1211 05:51:17.944955 4628 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 05:51:19 crc kubenswrapper[4628]: I1211 05:51:19.967236 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrs74" event={"ID":"1dda5112-2ea5-46b4-a3d6-af264ea070b5","Type":"ContainerStarted","Data":"4534d557603a5b1abd9a152e89286c28366ab4b3c6734cdcb93f61825e319f55"} Dec 11 05:51:20 crc kubenswrapper[4628]: I1211 05:51:20.981786 4628 generic.go:334] "Generic (PLEG): container finished" podID="1dda5112-2ea5-46b4-a3d6-af264ea070b5" containerID="4534d557603a5b1abd9a152e89286c28366ab4b3c6734cdcb93f61825e319f55" exitCode=0 Dec 11 05:51:20 crc kubenswrapper[4628]: I1211 05:51:20.981909 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrs74" event={"ID":"1dda5112-2ea5-46b4-a3d6-af264ea070b5","Type":"ContainerDied","Data":"4534d557603a5b1abd9a152e89286c28366ab4b3c6734cdcb93f61825e319f55"} Dec 11 05:51:21 crc kubenswrapper[4628]: I1211 05:51:21.995176 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrs74" event={"ID":"1dda5112-2ea5-46b4-a3d6-af264ea070b5","Type":"ContainerStarted","Data":"62abe39b132dfbbe98095edf1105bcb0c17b488582cca7b24699e3301e3b6589"} Dec 11 05:51:22 crc kubenswrapper[4628]: I1211 05:51:22.018815 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jrs74" podStartSLOduration=2.523469802 podStartE2EDuration="6.018795951s" podCreationTimestamp="2025-12-11 05:51:16 +0000 UTC" firstStartedPulling="2025-12-11 05:51:17.944709354 +0000 UTC m=+2180.362056052" lastFinishedPulling="2025-12-11 05:51:21.440035503 +0000 UTC m=+2183.857382201" observedRunningTime="2025-12-11 05:51:22.018274427 +0000 UTC m=+2184.435621165" watchObservedRunningTime="2025-12-11 05:51:22.018795951 +0000 UTC m=+2184.436142649" Dec 11 05:51:27 crc kubenswrapper[4628]: I1211 05:51:27.169998 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-jrs74" Dec 11 05:51:27 crc kubenswrapper[4628]: I1211 05:51:27.170491 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jrs74" Dec 11 05:51:27 crc kubenswrapper[4628]: I1211 05:51:27.217254 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jrs74" Dec 11 05:51:28 crc kubenswrapper[4628]: I1211 05:51:28.103231 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jrs74" Dec 11 05:51:28 crc kubenswrapper[4628]: I1211 05:51:28.154323 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jrs74"] Dec 11 05:51:30 crc kubenswrapper[4628]: I1211 05:51:30.089629 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jrs74" podUID="1dda5112-2ea5-46b4-a3d6-af264ea070b5" containerName="registry-server" containerID="cri-o://62abe39b132dfbbe98095edf1105bcb0c17b488582cca7b24699e3301e3b6589" gracePeriod=2 Dec 11 05:51:30 crc kubenswrapper[4628]: I1211 05:51:30.558789 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jrs74" Dec 11 05:51:30 crc kubenswrapper[4628]: I1211 05:51:30.636140 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dda5112-2ea5-46b4-a3d6-af264ea070b5-utilities\") pod \"1dda5112-2ea5-46b4-a3d6-af264ea070b5\" (UID: \"1dda5112-2ea5-46b4-a3d6-af264ea070b5\") " Dec 11 05:51:30 crc kubenswrapper[4628]: I1211 05:51:30.636217 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78ggt\" (UniqueName: \"kubernetes.io/projected/1dda5112-2ea5-46b4-a3d6-af264ea070b5-kube-api-access-78ggt\") pod \"1dda5112-2ea5-46b4-a3d6-af264ea070b5\" (UID: \"1dda5112-2ea5-46b4-a3d6-af264ea070b5\") " Dec 11 05:51:30 crc kubenswrapper[4628]: I1211 05:51:30.636246 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dda5112-2ea5-46b4-a3d6-af264ea070b5-catalog-content\") pod \"1dda5112-2ea5-46b4-a3d6-af264ea070b5\" (UID: \"1dda5112-2ea5-46b4-a3d6-af264ea070b5\") " Dec 11 05:51:30 crc kubenswrapper[4628]: I1211 05:51:30.637133 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dda5112-2ea5-46b4-a3d6-af264ea070b5-utilities" (OuterVolumeSpecName: "utilities") pod "1dda5112-2ea5-46b4-a3d6-af264ea070b5" (UID: "1dda5112-2ea5-46b4-a3d6-af264ea070b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:51:30 crc kubenswrapper[4628]: I1211 05:51:30.644078 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dda5112-2ea5-46b4-a3d6-af264ea070b5-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:51:30 crc kubenswrapper[4628]: I1211 05:51:30.645148 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dda5112-2ea5-46b4-a3d6-af264ea070b5-kube-api-access-78ggt" (OuterVolumeSpecName: "kube-api-access-78ggt") pod "1dda5112-2ea5-46b4-a3d6-af264ea070b5" (UID: "1dda5112-2ea5-46b4-a3d6-af264ea070b5"). InnerVolumeSpecName "kube-api-access-78ggt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:51:30 crc kubenswrapper[4628]: I1211 05:51:30.685128 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dda5112-2ea5-46b4-a3d6-af264ea070b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1dda5112-2ea5-46b4-a3d6-af264ea070b5" (UID: "1dda5112-2ea5-46b4-a3d6-af264ea070b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:51:30 crc kubenswrapper[4628]: I1211 05:51:30.746064 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78ggt\" (UniqueName: \"kubernetes.io/projected/1dda5112-2ea5-46b4-a3d6-af264ea070b5-kube-api-access-78ggt\") on node \"crc\" DevicePath \"\"" Dec 11 05:51:30 crc kubenswrapper[4628]: I1211 05:51:30.746101 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dda5112-2ea5-46b4-a3d6-af264ea070b5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:51:31 crc kubenswrapper[4628]: I1211 05:51:31.117221 4628 generic.go:334] "Generic (PLEG): container finished" podID="1dda5112-2ea5-46b4-a3d6-af264ea070b5" containerID="62abe39b132dfbbe98095edf1105bcb0c17b488582cca7b24699e3301e3b6589" exitCode=0 Dec 11 05:51:31 crc kubenswrapper[4628]: I1211 05:51:31.117295 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrs74" event={"ID":"1dda5112-2ea5-46b4-a3d6-af264ea070b5","Type":"ContainerDied","Data":"62abe39b132dfbbe98095edf1105bcb0c17b488582cca7b24699e3301e3b6589"} Dec 11 05:51:31 crc kubenswrapper[4628]: I1211 05:51:31.117365 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jrs74" event={"ID":"1dda5112-2ea5-46b4-a3d6-af264ea070b5","Type":"ContainerDied","Data":"d0bf0ad83fd7c3151c93ca5d4a4d141f3568cb9ef2e8167c6106f5b52a83d0b7"} Dec 11 05:51:31 crc kubenswrapper[4628]: I1211 05:51:31.117405 4628 scope.go:117] "RemoveContainer" containerID="62abe39b132dfbbe98095edf1105bcb0c17b488582cca7b24699e3301e3b6589" Dec 11 05:51:31 crc kubenswrapper[4628]: I1211 05:51:31.117521 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jrs74" Dec 11 05:51:31 crc kubenswrapper[4628]: I1211 05:51:31.166328 4628 scope.go:117] "RemoveContainer" containerID="4534d557603a5b1abd9a152e89286c28366ab4b3c6734cdcb93f61825e319f55" Dec 11 05:51:31 crc kubenswrapper[4628]: I1211 05:51:31.171811 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jrs74"] Dec 11 05:51:31 crc kubenswrapper[4628]: I1211 05:51:31.191998 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jrs74"] Dec 11 05:51:31 crc kubenswrapper[4628]: I1211 05:51:31.202927 4628 scope.go:117] "RemoveContainer" containerID="8950cb6284257e2c9b44fbf6635be68da007b77402e50502cc6c649d38203cd7" Dec 11 05:51:31 crc kubenswrapper[4628]: I1211 05:51:31.258196 4628 scope.go:117] "RemoveContainer" containerID="62abe39b132dfbbe98095edf1105bcb0c17b488582cca7b24699e3301e3b6589" Dec 11 05:51:31 crc kubenswrapper[4628]: E1211 05:51:31.258990 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62abe39b132dfbbe98095edf1105bcb0c17b488582cca7b24699e3301e3b6589\": container with ID starting with 62abe39b132dfbbe98095edf1105bcb0c17b488582cca7b24699e3301e3b6589 not found: ID does not exist" containerID="62abe39b132dfbbe98095edf1105bcb0c17b488582cca7b24699e3301e3b6589" Dec 11 05:51:31 crc kubenswrapper[4628]: I1211 05:51:31.259064 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62abe39b132dfbbe98095edf1105bcb0c17b488582cca7b24699e3301e3b6589"} err="failed to get container status \"62abe39b132dfbbe98095edf1105bcb0c17b488582cca7b24699e3301e3b6589\": rpc error: code = NotFound desc = could not find container \"62abe39b132dfbbe98095edf1105bcb0c17b488582cca7b24699e3301e3b6589\": container with ID starting with 62abe39b132dfbbe98095edf1105bcb0c17b488582cca7b24699e3301e3b6589 not found: ID does not exist" Dec 11 05:51:31 crc kubenswrapper[4628]: I1211 05:51:31.259112 4628 scope.go:117] "RemoveContainer" containerID="4534d557603a5b1abd9a152e89286c28366ab4b3c6734cdcb93f61825e319f55" Dec 11 05:51:31 crc kubenswrapper[4628]: E1211 05:51:31.259500 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4534d557603a5b1abd9a152e89286c28366ab4b3c6734cdcb93f61825e319f55\": container with ID starting with 4534d557603a5b1abd9a152e89286c28366ab4b3c6734cdcb93f61825e319f55 not found: ID does not exist" containerID="4534d557603a5b1abd9a152e89286c28366ab4b3c6734cdcb93f61825e319f55" Dec 11 05:51:31 crc kubenswrapper[4628]: I1211 05:51:31.259544 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4534d557603a5b1abd9a152e89286c28366ab4b3c6734cdcb93f61825e319f55"} err="failed to get container status \"4534d557603a5b1abd9a152e89286c28366ab4b3c6734cdcb93f61825e319f55\": rpc error: code = NotFound desc = could not find container \"4534d557603a5b1abd9a152e89286c28366ab4b3c6734cdcb93f61825e319f55\": container with ID starting with 4534d557603a5b1abd9a152e89286c28366ab4b3c6734cdcb93f61825e319f55 not found: ID does not exist" Dec 11 05:51:31 crc kubenswrapper[4628]: I1211 05:51:31.259576 4628 scope.go:117] "RemoveContainer" containerID="8950cb6284257e2c9b44fbf6635be68da007b77402e50502cc6c649d38203cd7" Dec 11 05:51:31 crc kubenswrapper[4628]: E1211 05:51:31.259815 4628 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8950cb6284257e2c9b44fbf6635be68da007b77402e50502cc6c649d38203cd7\": container with ID starting with 8950cb6284257e2c9b44fbf6635be68da007b77402e50502cc6c649d38203cd7 not found: ID does not exist" containerID="8950cb6284257e2c9b44fbf6635be68da007b77402e50502cc6c649d38203cd7" Dec 11 05:51:31 crc kubenswrapper[4628]: I1211 05:51:31.259883 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8950cb6284257e2c9b44fbf6635be68da007b77402e50502cc6c649d38203cd7"} err="failed to get container status \"8950cb6284257e2c9b44fbf6635be68da007b77402e50502cc6c649d38203cd7\": rpc error: code = NotFound desc = could not find container \"8950cb6284257e2c9b44fbf6635be68da007b77402e50502cc6c649d38203cd7\": container with ID starting with 8950cb6284257e2c9b44fbf6635be68da007b77402e50502cc6c649d38203cd7 not found: ID does not exist" Dec 11 05:51:31 crc kubenswrapper[4628]: I1211 05:51:31.427127 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:51:31 crc kubenswrapper[4628]: I1211 05:51:31.427216 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:51:31 crc kubenswrapper[4628]: I1211 05:51:31.940808 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dda5112-2ea5-46b4-a3d6-af264ea070b5" path="/var/lib/kubelet/pods/1dda5112-2ea5-46b4-a3d6-af264ea070b5/volumes" Dec 11 05:51:49 crc kubenswrapper[4628]: I1211 05:51:49.888226 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hhpm8"] Dec 11 05:51:49 crc kubenswrapper[4628]: E1211 05:51:49.890385 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dda5112-2ea5-46b4-a3d6-af264ea070b5" containerName="extract-content" Dec 11 05:51:49 crc kubenswrapper[4628]: I1211 05:51:49.890485 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dda5112-2ea5-46b4-a3d6-af264ea070b5" containerName="extract-content" Dec 11 05:51:49 crc kubenswrapper[4628]: E1211 05:51:49.890560 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dda5112-2ea5-46b4-a3d6-af264ea070b5" containerName="registry-server" Dec 11 05:51:49 crc kubenswrapper[4628]: I1211 05:51:49.890643 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dda5112-2ea5-46b4-a3d6-af264ea070b5" containerName="registry-server" Dec 11 05:51:49 crc kubenswrapper[4628]: E1211 05:51:49.890740 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dda5112-2ea5-46b4-a3d6-af264ea070b5" containerName="extract-utilities" Dec 11 05:51:49 crc kubenswrapper[4628]: I1211 05:51:49.890831 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dda5112-2ea5-46b4-a3d6-af264ea070b5" containerName="extract-utilities" Dec 11 05:51:49 crc kubenswrapper[4628]: I1211 05:51:49.891180 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dda5112-2ea5-46b4-a3d6-af264ea070b5" containerName="registry-server" Dec 11 05:51:49 crc kubenswrapper[4628]: I1211 
05:51:49.893139 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hhpm8" Dec 11 05:51:49 crc kubenswrapper[4628]: I1211 05:51:49.933788 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hhpm8"] Dec 11 05:51:49 crc kubenswrapper[4628]: I1211 05:51:49.978037 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw48s\" (UniqueName: \"kubernetes.io/projected/4b1060b3-b635-4700-a2b1-95d1d21c0177-kube-api-access-rw48s\") pod \"redhat-operators-hhpm8\" (UID: \"4b1060b3-b635-4700-a2b1-95d1d21c0177\") " pod="openshift-marketplace/redhat-operators-hhpm8" Dec 11 05:51:49 crc kubenswrapper[4628]: I1211 05:51:49.978124 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1060b3-b635-4700-a2b1-95d1d21c0177-catalog-content\") pod \"redhat-operators-hhpm8\" (UID: \"4b1060b3-b635-4700-a2b1-95d1d21c0177\") " pod="openshift-marketplace/redhat-operators-hhpm8" Dec 11 05:51:49 crc kubenswrapper[4628]: I1211 05:51:49.978206 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1060b3-b635-4700-a2b1-95d1d21c0177-utilities\") pod \"redhat-operators-hhpm8\" (UID: \"4b1060b3-b635-4700-a2b1-95d1d21c0177\") " pod="openshift-marketplace/redhat-operators-hhpm8" Dec 11 05:51:50 crc kubenswrapper[4628]: I1211 05:51:50.080258 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1060b3-b635-4700-a2b1-95d1d21c0177-catalog-content\") pod \"redhat-operators-hhpm8\" (UID: \"4b1060b3-b635-4700-a2b1-95d1d21c0177\") " pod="openshift-marketplace/redhat-operators-hhpm8" Dec 11 05:51:50 crc kubenswrapper[4628]: I1211 05:51:50.080364 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1060b3-b635-4700-a2b1-95d1d21c0177-utilities\") pod \"redhat-operators-hhpm8\" (UID: \"4b1060b3-b635-4700-a2b1-95d1d21c0177\") " pod="openshift-marketplace/redhat-operators-hhpm8" Dec 11 05:51:50 crc kubenswrapper[4628]: I1211 05:51:50.080468 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw48s\" (UniqueName: \"kubernetes.io/projected/4b1060b3-b635-4700-a2b1-95d1d21c0177-kube-api-access-rw48s\") pod \"redhat-operators-hhpm8\" (UID: \"4b1060b3-b635-4700-a2b1-95d1d21c0177\") " pod="openshift-marketplace/redhat-operators-hhpm8" Dec 11 05:51:50 crc kubenswrapper[4628]: I1211 05:51:50.081205 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1060b3-b635-4700-a2b1-95d1d21c0177-catalog-content\") pod \"redhat-operators-hhpm8\" (UID: \"4b1060b3-b635-4700-a2b1-95d1d21c0177\") " pod="openshift-marketplace/redhat-operators-hhpm8" Dec 11 05:51:50 crc kubenswrapper[4628]: I1211 05:51:50.081245 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1060b3-b635-4700-a2b1-95d1d21c0177-utilities\") pod \"redhat-operators-hhpm8\" (UID: \"4b1060b3-b635-4700-a2b1-95d1d21c0177\") " pod="openshift-marketplace/redhat-operators-hhpm8" Dec 11 05:51:50 crc kubenswrapper[4628]: I1211 05:51:50.100648 4628 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw48s\" (UniqueName: \"kubernetes.io/projected/4b1060b3-b635-4700-a2b1-95d1d21c0177-kube-api-access-rw48s\") pod \"redhat-operators-hhpm8\" (UID: \"4b1060b3-b635-4700-a2b1-95d1d21c0177\") " pod="openshift-marketplace/redhat-operators-hhpm8" Dec 11 05:51:50 crc kubenswrapper[4628]: I1211 05:51:50.220143 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hhpm8" Dec 11 05:51:50 crc kubenswrapper[4628]: I1211 05:51:50.705524 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hhpm8"] Dec 11 05:51:51 crc kubenswrapper[4628]: I1211 05:51:51.354876 4628 generic.go:334] "Generic (PLEG): container finished" podID="4b1060b3-b635-4700-a2b1-95d1d21c0177" containerID="a79653b01f8d228486160324bb027483bae3819d4a9e4a7ffb6d75d67828a137" exitCode=0 Dec 11 05:51:51 crc kubenswrapper[4628]: I1211 05:51:51.354932 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hhpm8" event={"ID":"4b1060b3-b635-4700-a2b1-95d1d21c0177","Type":"ContainerDied","Data":"a79653b01f8d228486160324bb027483bae3819d4a9e4a7ffb6d75d67828a137"} Dec 11 05:51:51 crc kubenswrapper[4628]: I1211 05:51:51.355182 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hhpm8" event={"ID":"4b1060b3-b635-4700-a2b1-95d1d21c0177","Type":"ContainerStarted","Data":"3140d1c2f0da28171f1fb1795dd4fa306f08b3ca9058d2ac84145cb708438b0e"} Dec 11 05:51:52 crc kubenswrapper[4628]: I1211 05:51:52.370036 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hhpm8" event={"ID":"4b1060b3-b635-4700-a2b1-95d1d21c0177","Type":"ContainerStarted","Data":"b8a032c8bdd2daa0a0a96baf75cdbf94354c7995bcc0c2a96318cd0e268561b0"} Dec 11 05:51:56 crc kubenswrapper[4628]: I1211 05:51:56.409635 4628 generic.go:334] "Generic (PLEG): container finished" podID="4b1060b3-b635-4700-a2b1-95d1d21c0177" containerID="b8a032c8bdd2daa0a0a96baf75cdbf94354c7995bcc0c2a96318cd0e268561b0" exitCode=0 Dec 11 05:51:56 crc kubenswrapper[4628]: I1211 05:51:56.409721 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hhpm8" event={"ID":"4b1060b3-b635-4700-a2b1-95d1d21c0177","Type":"ContainerDied","Data":"b8a032c8bdd2daa0a0a96baf75cdbf94354c7995bcc0c2a96318cd0e268561b0"} Dec 11 05:51:57 crc kubenswrapper[4628]: I1211 05:51:57.422482 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hhpm8" event={"ID":"4b1060b3-b635-4700-a2b1-95d1d21c0177","Type":"ContainerStarted","Data":"e3f2c543862757cecdb592abf93eca39cda04266dbd014fb0618d4bab371a9a3"} Dec 11 05:51:57 crc kubenswrapper[4628]: I1211 05:51:57.453182 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hhpm8" podStartSLOduration=2.871097716 podStartE2EDuration="8.453164746s" podCreationTimestamp="2025-12-11 05:51:49 +0000 UTC" firstStartedPulling="2025-12-11 05:51:51.356189879 +0000 UTC m=+2213.773536577" lastFinishedPulling="2025-12-11 05:51:56.938256909 +0000 UTC m=+2219.355603607" observedRunningTime="2025-12-11 05:51:57.447046361 +0000 UTC m=+2219.864393069" watchObservedRunningTime="2025-12-11 05:51:57.453164746 +0000 UTC m=+2219.870511444" Dec 11 05:52:00 crc kubenswrapper[4628]: I1211 05:52:00.220286 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-hhpm8" Dec 11 05:52:00 crc kubenswrapper[4628]: I1211 05:52:00.221151 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hhpm8" Dec 11 05:52:01 crc kubenswrapper[4628]: I1211 05:52:01.269434 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hhpm8" podUID="4b1060b3-b635-4700-a2b1-95d1d21c0177" containerName="registry-server" probeResult="failure" output=< Dec 11 05:52:01 crc kubenswrapper[4628]: timeout: failed to connect service ":50051" within 1s Dec 11 05:52:01 crc kubenswrapper[4628]: > Dec 11 05:52:01 crc kubenswrapper[4628]: I1211 05:52:01.427249 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:52:01 crc kubenswrapper[4628]: I1211 05:52:01.427322 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 05:52:01 crc kubenswrapper[4628]: I1211 05:52:01.427381 4628 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 05:52:01 crc kubenswrapper[4628]: I1211 05:52:01.427970 4628 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0"} pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 05:52:01 crc kubenswrapper[4628]: I1211 05:52:01.428030 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" containerID="cri-o://bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" gracePeriod=600 Dec 11 05:52:02 crc kubenswrapper[4628]: E1211 05:52:02.051632 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:52:02 crc kubenswrapper[4628]: I1211 05:52:02.485867 4628 generic.go:334] "Generic (PLEG): container finished" podID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" exitCode=0 Dec 11 05:52:02 crc kubenswrapper[4628]: I1211 05:52:02.485938 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerDied","Data":"bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0"} Dec 11 05:52:02 crc 
kubenswrapper[4628]: I1211 05:52:02.486006 4628 scope.go:117] "RemoveContainer" containerID="9cede5e779a453211cb2fa7e39e6adea8ce5ae18e8668e6f7a6e7c92985fdcda" Dec 11 05:52:02 crc kubenswrapper[4628]: I1211 05:52:02.487442 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:52:02 crc kubenswrapper[4628]: E1211 05:52:02.488322 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:52:10 crc kubenswrapper[4628]: I1211 05:52:10.274229 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hhpm8" Dec 11 05:52:10 crc kubenswrapper[4628]: I1211 05:52:10.335559 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hhpm8" Dec 11 05:52:10 crc kubenswrapper[4628]: I1211 05:52:10.512961 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hhpm8"] Dec 11 05:52:11 crc kubenswrapper[4628]: I1211 05:52:11.559650 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hhpm8" podUID="4b1060b3-b635-4700-a2b1-95d1d21c0177" containerName="registry-server" containerID="cri-o://e3f2c543862757cecdb592abf93eca39cda04266dbd014fb0618d4bab371a9a3" gracePeriod=2 Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.011536 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hhpm8" Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.132274 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1060b3-b635-4700-a2b1-95d1d21c0177-catalog-content\") pod \"4b1060b3-b635-4700-a2b1-95d1d21c0177\" (UID: \"4b1060b3-b635-4700-a2b1-95d1d21c0177\") " Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.132384 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1060b3-b635-4700-a2b1-95d1d21c0177-utilities\") pod \"4b1060b3-b635-4700-a2b1-95d1d21c0177\" (UID: \"4b1060b3-b635-4700-a2b1-95d1d21c0177\") " Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.132491 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw48s\" (UniqueName: \"kubernetes.io/projected/4b1060b3-b635-4700-a2b1-95d1d21c0177-kube-api-access-rw48s\") pod \"4b1060b3-b635-4700-a2b1-95d1d21c0177\" (UID: \"4b1060b3-b635-4700-a2b1-95d1d21c0177\") " Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.133139 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b1060b3-b635-4700-a2b1-95d1d21c0177-utilities" (OuterVolumeSpecName: "utilities") pod "4b1060b3-b635-4700-a2b1-95d1d21c0177" (UID: "4b1060b3-b635-4700-a2b1-95d1d21c0177"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.138317 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b1060b3-b635-4700-a2b1-95d1d21c0177-kube-api-access-rw48s" (OuterVolumeSpecName: "kube-api-access-rw48s") pod "4b1060b3-b635-4700-a2b1-95d1d21c0177" (UID: "4b1060b3-b635-4700-a2b1-95d1d21c0177"). InnerVolumeSpecName "kube-api-access-rw48s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.238673 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1060b3-b635-4700-a2b1-95d1d21c0177-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.238731 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw48s\" (UniqueName: \"kubernetes.io/projected/4b1060b3-b635-4700-a2b1-95d1d21c0177-kube-api-access-rw48s\") on node \"crc\" DevicePath \"\"" Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.246823 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b1060b3-b635-4700-a2b1-95d1d21c0177-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b1060b3-b635-4700-a2b1-95d1d21c0177" (UID: "4b1060b3-b635-4700-a2b1-95d1d21c0177"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.341520 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1060b3-b635-4700-a2b1-95d1d21c0177-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.578125 4628 generic.go:334] "Generic (PLEG): container finished" podID="4b1060b3-b635-4700-a2b1-95d1d21c0177" containerID="e3f2c543862757cecdb592abf93eca39cda04266dbd014fb0618d4bab371a9a3" exitCode=0 Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.578196 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hhpm8" event={"ID":"4b1060b3-b635-4700-a2b1-95d1d21c0177","Type":"ContainerDied","Data":"e3f2c543862757cecdb592abf93eca39cda04266dbd014fb0618d4bab371a9a3"} Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.578244 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hhpm8" event={"ID":"4b1060b3-b635-4700-a2b1-95d1d21c0177","Type":"ContainerDied","Data":"3140d1c2f0da28171f1fb1795dd4fa306f08b3ca9058d2ac84145cb708438b0e"} Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.578244 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hhpm8" Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.578267 4628 scope.go:117] "RemoveContainer" containerID="e3f2c543862757cecdb592abf93eca39cda04266dbd014fb0618d4bab371a9a3" Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.603482 4628 scope.go:117] "RemoveContainer" containerID="b8a032c8bdd2daa0a0a96baf75cdbf94354c7995bcc0c2a96318cd0e268561b0" Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.644004 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hhpm8"] Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.657049 4628 scope.go:117] "RemoveContainer" containerID="a79653b01f8d228486160324bb027483bae3819d4a9e4a7ffb6d75d67828a137" Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.675825 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hhpm8"] Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.704409 4628 scope.go:117] "RemoveContainer" containerID="e3f2c543862757cecdb592abf93eca39cda04266dbd014fb0618d4bab371a9a3" Dec 11 05:52:12 crc kubenswrapper[4628]: E1211 05:52:12.707996 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3f2c543862757cecdb592abf93eca39cda04266dbd014fb0618d4bab371a9a3\": container with ID starting with e3f2c543862757cecdb592abf93eca39cda04266dbd014fb0618d4bab371a9a3 not found: ID does not exist" containerID="e3f2c543862757cecdb592abf93eca39cda04266dbd014fb0618d4bab371a9a3" Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.708128 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3f2c543862757cecdb592abf93eca39cda04266dbd014fb0618d4bab371a9a3"} err="failed to get container status \"e3f2c543862757cecdb592abf93eca39cda04266dbd014fb0618d4bab371a9a3\": rpc error: code = NotFound desc = could not find container \"e3f2c543862757cecdb592abf93eca39cda04266dbd014fb0618d4bab371a9a3\": container with ID starting with e3f2c543862757cecdb592abf93eca39cda04266dbd014fb0618d4bab371a9a3 not found: ID does not exist" Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.708215 4628 scope.go:117] "RemoveContainer" containerID="b8a032c8bdd2daa0a0a96baf75cdbf94354c7995bcc0c2a96318cd0e268561b0" Dec 11 05:52:12 crc kubenswrapper[4628]: E1211 05:52:12.708696 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8a032c8bdd2daa0a0a96baf75cdbf94354c7995bcc0c2a96318cd0e268561b0\": container with ID starting with b8a032c8bdd2daa0a0a96baf75cdbf94354c7995bcc0c2a96318cd0e268561b0 not found: ID does not exist" containerID="b8a032c8bdd2daa0a0a96baf75cdbf94354c7995bcc0c2a96318cd0e268561b0" Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.708776 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8a032c8bdd2daa0a0a96baf75cdbf94354c7995bcc0c2a96318cd0e268561b0"} err="failed to get container status \"b8a032c8bdd2daa0a0a96baf75cdbf94354c7995bcc0c2a96318cd0e268561b0\": rpc error: code = NotFound desc = could not find container \"b8a032c8bdd2daa0a0a96baf75cdbf94354c7995bcc0c2a96318cd0e268561b0\": container with ID starting with b8a032c8bdd2daa0a0a96baf75cdbf94354c7995bcc0c2a96318cd0e268561b0 not found: ID does not exist" Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.708816 4628 scope.go:117] "RemoveContainer" 
containerID="a79653b01f8d228486160324bb027483bae3819d4a9e4a7ffb6d75d67828a137" Dec 11 05:52:12 crc kubenswrapper[4628]: E1211 05:52:12.709142 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a79653b01f8d228486160324bb027483bae3819d4a9e4a7ffb6d75d67828a137\": container with ID starting with a79653b01f8d228486160324bb027483bae3819d4a9e4a7ffb6d75d67828a137 not found: ID does not exist" containerID="a79653b01f8d228486160324bb027483bae3819d4a9e4a7ffb6d75d67828a137" Dec 11 05:52:12 crc kubenswrapper[4628]: I1211 05:52:12.709232 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a79653b01f8d228486160324bb027483bae3819d4a9e4a7ffb6d75d67828a137"} err="failed to get container status \"a79653b01f8d228486160324bb027483bae3819d4a9e4a7ffb6d75d67828a137\": rpc error: code = NotFound desc = could not find container \"a79653b01f8d228486160324bb027483bae3819d4a9e4a7ffb6d75d67828a137\": container with ID starting with a79653b01f8d228486160324bb027483bae3819d4a9e4a7ffb6d75d67828a137 not found: ID does not exist" Dec 11 05:52:13 crc kubenswrapper[4628]: I1211 05:52:13.900537 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b1060b3-b635-4700-a2b1-95d1d21c0177" path="/var/lib/kubelet/pods/4b1060b3-b635-4700-a2b1-95d1d21c0177/volumes" Dec 11 05:52:16 crc kubenswrapper[4628]: I1211 05:52:16.890045 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:52:16 crc kubenswrapper[4628]: E1211 05:52:16.890637 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:52:27 crc kubenswrapper[4628]: I1211 05:52:27.900611 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:52:27 crc kubenswrapper[4628]: E1211 05:52:27.901489 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:52:41 crc kubenswrapper[4628]: I1211 05:52:41.889424 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:52:41 crc kubenswrapper[4628]: E1211 05:52:41.890235 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:52:53 crc kubenswrapper[4628]: I1211 05:52:53.890132 4628 scope.go:117] "RemoveContainer" 
containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:52:53 crc kubenswrapper[4628]: E1211 05:52:53.891202 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:53:07 crc kubenswrapper[4628]: I1211 05:53:07.899637 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:53:07 crc kubenswrapper[4628]: E1211 05:53:07.900681 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:53:18 crc kubenswrapper[4628]: I1211 05:53:18.888957 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:53:18 crc kubenswrapper[4628]: E1211 05:53:18.889668 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:53:31 crc kubenswrapper[4628]: I1211 05:53:31.890906 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:53:31 crc kubenswrapper[4628]: E1211 05:53:31.892317 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:53:42 crc kubenswrapper[4628]: I1211 05:53:42.890204 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:53:42 crc kubenswrapper[4628]: E1211 05:53:42.890921 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:53:56 crc kubenswrapper[4628]: I1211 05:53:56.889901 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:53:56 crc kubenswrapper[4628]: E1211 05:53:56.890690 4628 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:54:10 crc kubenswrapper[4628]: I1211 05:54:10.889772 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:54:10 crc kubenswrapper[4628]: E1211 05:54:10.890556 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:54:24 crc kubenswrapper[4628]: I1211 05:54:24.890401 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:54:24 crc kubenswrapper[4628]: E1211 05:54:24.891428 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:54:37 crc kubenswrapper[4628]: I1211 05:54:37.896775 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:54:37 crc kubenswrapper[4628]: E1211 05:54:37.897605 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:54:50 crc kubenswrapper[4628]: I1211 05:54:50.890648 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:54:50 crc kubenswrapper[4628]: E1211 05:54:50.891722 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:55:01 crc kubenswrapper[4628]: I1211 05:55:01.200644 4628 generic.go:334] "Generic (PLEG): container finished" podID="10745043-8954-4864-9b9b-d3b2e8614e36" containerID="1345efc2980ca452751ee792821e80ad94643886b43b6d9eea6ea81db8fed4a4" exitCode=0 Dec 11 05:55:01 crc kubenswrapper[4628]: I1211 05:55:01.200837 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" 
event={"ID":"10745043-8954-4864-9b9b-d3b2e8614e36","Type":"ContainerDied","Data":"1345efc2980ca452751ee792821e80ad94643886b43b6d9eea6ea81db8fed4a4"} Dec 11 05:55:02 crc kubenswrapper[4628]: I1211 05:55:02.638897 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:55:02 crc kubenswrapper[4628]: I1211 05:55:02.706534 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-ssh-key\") pod \"10745043-8954-4864-9b9b-d3b2e8614e36\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " Dec 11 05:55:02 crc kubenswrapper[4628]: I1211 05:55:02.706603 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-libvirt-combined-ca-bundle\") pod \"10745043-8954-4864-9b9b-d3b2e8614e36\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " Dec 11 05:55:02 crc kubenswrapper[4628]: I1211 05:55:02.706782 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-libvirt-secret-0\") pod \"10745043-8954-4864-9b9b-d3b2e8614e36\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " Dec 11 05:55:02 crc kubenswrapper[4628]: I1211 05:55:02.707059 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-745dq\" (UniqueName: \"kubernetes.io/projected/10745043-8954-4864-9b9b-d3b2e8614e36-kube-api-access-745dq\") pod \"10745043-8954-4864-9b9b-d3b2e8614e36\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " Dec 11 05:55:02 crc kubenswrapper[4628]: I1211 05:55:02.707189 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-inventory\") pod \"10745043-8954-4864-9b9b-d3b2e8614e36\" (UID: \"10745043-8954-4864-9b9b-d3b2e8614e36\") " Dec 11 05:55:02 crc kubenswrapper[4628]: I1211 05:55:02.712791 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "10745043-8954-4864-9b9b-d3b2e8614e36" (UID: "10745043-8954-4864-9b9b-d3b2e8614e36"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:55:02 crc kubenswrapper[4628]: I1211 05:55:02.713793 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10745043-8954-4864-9b9b-d3b2e8614e36-kube-api-access-745dq" (OuterVolumeSpecName: "kube-api-access-745dq") pod "10745043-8954-4864-9b9b-d3b2e8614e36" (UID: "10745043-8954-4864-9b9b-d3b2e8614e36"). InnerVolumeSpecName "kube-api-access-745dq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:55:02 crc kubenswrapper[4628]: I1211 05:55:02.740936 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "10745043-8954-4864-9b9b-d3b2e8614e36" (UID: "10745043-8954-4864-9b9b-d3b2e8614e36"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:55:02 crc kubenswrapper[4628]: I1211 05:55:02.741389 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-inventory" (OuterVolumeSpecName: "inventory") pod "10745043-8954-4864-9b9b-d3b2e8614e36" (UID: "10745043-8954-4864-9b9b-d3b2e8614e36"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:55:02 crc kubenswrapper[4628]: I1211 05:55:02.742304 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "10745043-8954-4864-9b9b-d3b2e8614e36" (UID: "10745043-8954-4864-9b9b-d3b2e8614e36"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:55:02 crc kubenswrapper[4628]: I1211 05:55:02.810005 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-745dq\" (UniqueName: \"kubernetes.io/projected/10745043-8954-4864-9b9b-d3b2e8614e36-kube-api-access-745dq\") on node \"crc\" DevicePath \"\"" Dec 11 05:55:02 crc kubenswrapper[4628]: I1211 05:55:02.810039 4628 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 05:55:02 crc kubenswrapper[4628]: I1211 05:55:02.810049 4628 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:55:02 crc kubenswrapper[4628]: I1211 05:55:02.810057 4628 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:55:02 crc kubenswrapper[4628]: I1211 05:55:02.810066 4628 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/10745043-8954-4864-9b9b-d3b2e8614e36-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:55:02 crc kubenswrapper[4628]: I1211 05:55:02.890657 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:55:02 crc kubenswrapper[4628]: E1211 05:55:02.890944 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.235344 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" event={"ID":"10745043-8954-4864-9b9b-d3b2e8614e36","Type":"ContainerDied","Data":"ab9dec114d60df6b3d1f784a19b2ecc0f4f8992b03f098995d9d06ddf5a025d6"} Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.235409 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab9dec114d60df6b3d1f784a19b2ecc0f4f8992b03f098995d9d06ddf5a025d6" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.235452 4628 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.367131 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t"] Dec 11 05:55:03 crc kubenswrapper[4628]: E1211 05:55:03.367750 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b1060b3-b635-4700-a2b1-95d1d21c0177" containerName="extract-utilities" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.367770 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b1060b3-b635-4700-a2b1-95d1d21c0177" containerName="extract-utilities" Dec 11 05:55:03 crc kubenswrapper[4628]: E1211 05:55:03.367803 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10745043-8954-4864-9b9b-d3b2e8614e36" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.367810 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="10745043-8954-4864-9b9b-d3b2e8614e36" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 11 05:55:03 crc kubenswrapper[4628]: E1211 05:55:03.367818 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b1060b3-b635-4700-a2b1-95d1d21c0177" containerName="extract-content" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.367826 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b1060b3-b635-4700-a2b1-95d1d21c0177" containerName="extract-content" Dec 11 05:55:03 crc kubenswrapper[4628]: E1211 05:55:03.367858 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b1060b3-b635-4700-a2b1-95d1d21c0177" containerName="registry-server" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.367864 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b1060b3-b635-4700-a2b1-95d1d21c0177" containerName="registry-server" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.368205 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="10745043-8954-4864-9b9b-d3b2e8614e36" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.368229 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b1060b3-b635-4700-a2b1-95d1d21c0177" containerName="registry-server" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.368793 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.375469 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.377818 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.385455 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.385676 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.385907 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t5hzf" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.386067 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.386271 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.392654 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t"] Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.457128 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.457183 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.457221 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4b2f\" (UniqueName: \"kubernetes.io/projected/1b0b9e64-e4c3-4250-ae8d-319461717fcd-kube-api-access-g4b2f\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.457263 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.457360 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.457535 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.457608 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.457736 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.457762 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.560298 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.560381 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.560450 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4b2f\" (UniqueName: \"kubernetes.io/projected/1b0b9e64-e4c3-4250-ae8d-319461717fcd-kube-api-access-g4b2f\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.560526 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.560554 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.560646 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.560709 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.560796 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.560853 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.562633 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.564744 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.566554 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.566604 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.567356 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.567472 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.567773 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.576033 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.586314 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4b2f\" (UniqueName: \"kubernetes.io/projected/1b0b9e64-e4c3-4250-ae8d-319461717fcd-kube-api-access-g4b2f\") pod \"nova-edpm-deployment-openstack-edpm-ipam-fz78t\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:03 crc kubenswrapper[4628]: I1211 05:55:03.730049 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:55:04 crc kubenswrapper[4628]: I1211 05:55:04.357483 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t"] Dec 11 05:55:05 crc kubenswrapper[4628]: I1211 05:55:05.259995 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" event={"ID":"1b0b9e64-e4c3-4250-ae8d-319461717fcd","Type":"ContainerStarted","Data":"2f970bd81d4151c3007d044c94aa8133ab4057b7cf4c3625a063d3d0b6a4603a"} Dec 11 05:55:05 crc kubenswrapper[4628]: I1211 05:55:05.260439 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" event={"ID":"1b0b9e64-e4c3-4250-ae8d-319461717fcd","Type":"ContainerStarted","Data":"485d7046175687347c71c9dde7c25137e8c81349c4b83812092cbb01cb1095be"} Dec 11 05:55:05 crc kubenswrapper[4628]: I1211 05:55:05.287455 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" podStartSLOduration=1.790591488 podStartE2EDuration="2.28743627s" podCreationTimestamp="2025-12-11 05:55:03 +0000 UTC" firstStartedPulling="2025-12-11 05:55:04.370533172 +0000 UTC m=+2406.787879870" lastFinishedPulling="2025-12-11 05:55:04.867377914 +0000 UTC m=+2407.284724652" observedRunningTime="2025-12-11 05:55:05.28337836 +0000 UTC m=+2407.700725058" watchObservedRunningTime="2025-12-11 05:55:05.28743627 +0000 UTC m=+2407.704782978" Dec 11 05:55:14 crc kubenswrapper[4628]: I1211 05:55:14.890578 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:55:14 crc kubenswrapper[4628]: E1211 05:55:14.891309 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:55:25 crc kubenswrapper[4628]: I1211 05:55:25.889646 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:55:25 crc kubenswrapper[4628]: E1211 05:55:25.890900 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:55:38 crc kubenswrapper[4628]: I1211 05:55:38.890409 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:55:38 crc kubenswrapper[4628]: E1211 05:55:38.891675 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" 
podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:55:51 crc kubenswrapper[4628]: I1211 05:55:51.895597 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:55:51 crc kubenswrapper[4628]: E1211 05:55:51.896319 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:56:05 crc kubenswrapper[4628]: I1211 05:56:05.889475 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:56:05 crc kubenswrapper[4628]: E1211 05:56:05.890304 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:56:19 crc kubenswrapper[4628]: I1211 05:56:19.890270 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:56:19 crc kubenswrapper[4628]: E1211 05:56:19.891434 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:56:30 crc kubenswrapper[4628]: I1211 05:56:30.889280 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:56:30 crc kubenswrapper[4628]: E1211 05:56:30.891613 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:56:41 crc kubenswrapper[4628]: I1211 05:56:41.891165 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:56:41 crc kubenswrapper[4628]: E1211 05:56:41.892184 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:56:54 crc kubenswrapper[4628]: I1211 05:56:54.889699 4628 scope.go:117] "RemoveContainer" 
containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:56:54 crc kubenswrapper[4628]: E1211 05:56:54.890434 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 05:57:08 crc kubenswrapper[4628]: I1211 05:57:08.889495 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 05:57:09 crc kubenswrapper[4628]: I1211 05:57:09.702581 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"522036b9d0739c4f5dc7b8387fb0052860bb6519261fec1597dd6379d1a48f3b"} Dec 11 05:58:17 crc kubenswrapper[4628]: I1211 05:58:17.339372 4628 generic.go:334] "Generic (PLEG): container finished" podID="1b0b9e64-e4c3-4250-ae8d-319461717fcd" containerID="2f970bd81d4151c3007d044c94aa8133ab4057b7cf4c3625a063d3d0b6a4603a" exitCode=0 Dec 11 05:58:17 crc kubenswrapper[4628]: I1211 05:58:17.339466 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" event={"ID":"1b0b9e64-e4c3-4250-ae8d-319461717fcd","Type":"ContainerDied","Data":"2f970bd81d4151c3007d044c94aa8133ab4057b7cf4c3625a063d3d0b6a4603a"} Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.786199 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.819405 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-cell1-compute-config-0\") pod \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.819509 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-migration-ssh-key-0\") pod \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.819573 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4b2f\" (UniqueName: \"kubernetes.io/projected/1b0b9e64-e4c3-4250-ae8d-319461717fcd-kube-api-access-g4b2f\") pod \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.819618 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-combined-ca-bundle\") pod \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.819661 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-cell1-compute-config-1\") pod \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.819738 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-migration-ssh-key-1\") pod \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.819762 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-ssh-key\") pod \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.819868 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-inventory\") pod \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.819939 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-extra-config-0\") pod \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\" (UID: \"1b0b9e64-e4c3-4250-ae8d-319461717fcd\") " Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.834036 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "1b0b9e64-e4c3-4250-ae8d-319461717fcd" (UID: "1b0b9e64-e4c3-4250-ae8d-319461717fcd"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.853501 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b0b9e64-e4c3-4250-ae8d-319461717fcd-kube-api-access-g4b2f" (OuterVolumeSpecName: "kube-api-access-g4b2f") pod "1b0b9e64-e4c3-4250-ae8d-319461717fcd" (UID: "1b0b9e64-e4c3-4250-ae8d-319461717fcd"). InnerVolumeSpecName "kube-api-access-g4b2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.860271 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-inventory" (OuterVolumeSpecName: "inventory") pod "1b0b9e64-e4c3-4250-ae8d-319461717fcd" (UID: "1b0b9e64-e4c3-4250-ae8d-319461717fcd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.864241 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "1b0b9e64-e4c3-4250-ae8d-319461717fcd" (UID: "1b0b9e64-e4c3-4250-ae8d-319461717fcd"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.878769 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "1b0b9e64-e4c3-4250-ae8d-319461717fcd" (UID: "1b0b9e64-e4c3-4250-ae8d-319461717fcd"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.883749 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1b0b9e64-e4c3-4250-ae8d-319461717fcd" (UID: "1b0b9e64-e4c3-4250-ae8d-319461717fcd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.898691 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "1b0b9e64-e4c3-4250-ae8d-319461717fcd" (UID: "1b0b9e64-e4c3-4250-ae8d-319461717fcd"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.901613 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "1b0b9e64-e4c3-4250-ae8d-319461717fcd" (UID: "1b0b9e64-e4c3-4250-ae8d-319461717fcd"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.903204 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "1b0b9e64-e4c3-4250-ae8d-319461717fcd" (UID: "1b0b9e64-e4c3-4250-ae8d-319461717fcd"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.922818 4628 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.922874 4628 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.922888 4628 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.922901 4628 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.922913 4628 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.922925 4628 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.922936 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4b2f\" (UniqueName: \"kubernetes.io/projected/1b0b9e64-e4c3-4250-ae8d-319461717fcd-kube-api-access-g4b2f\") on node \"crc\" DevicePath \"\"" Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.922946 4628 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 05:58:18 crc kubenswrapper[4628]: I1211 05:58:18.922956 4628 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1b0b9e64-e4c3-4250-ae8d-319461717fcd-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.357054 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" event={"ID":"1b0b9e64-e4c3-4250-ae8d-319461717fcd","Type":"ContainerDied","Data":"485d7046175687347c71c9dde7c25137e8c81349c4b83812092cbb01cb1095be"} Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.357090 4628 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="485d7046175687347c71c9dde7c25137e8c81349c4b83812092cbb01cb1095be" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.357144 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-fz78t" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.539268 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db"] Dec 11 05:58:19 crc kubenswrapper[4628]: E1211 05:58:19.539675 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b0b9e64-e4c3-4250-ae8d-319461717fcd" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.539693 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b0b9e64-e4c3-4250-ae8d-319461717fcd" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.539901 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b0b9e64-e4c3-4250-ae8d-319461717fcd" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.540528 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.543169 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.543552 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.543820 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-t5hzf" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.544316 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.556064 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.561829 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db"] Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.635925 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.636001 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.636064 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.636090 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.636123 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.636298 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.636471 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t424\" (UniqueName: \"kubernetes.io/projected/70e52eb8-3a47-4192-9d87-3178a99becfe-kube-api-access-6t424\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.738492 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.738896 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.738944 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: 
I1211 05:58:19.738974 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.739021 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.739054 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t424\" (UniqueName: \"kubernetes.io/projected/70e52eb8-3a47-4192-9d87-3178a99becfe-kube-api-access-6t424\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.739095 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.743676 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.744329 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.744775 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.745243 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: 
I1211 05:58:19.748949 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.750295 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.767689 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t424\" (UniqueName: \"kubernetes.io/projected/70e52eb8-3a47-4192-9d87-3178a99becfe-kube-api-access-6t424\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-f28db\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:19 crc kubenswrapper[4628]: I1211 05:58:19.880337 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 05:58:20 crc kubenswrapper[4628]: I1211 05:58:20.528578 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db"] Dec 11 05:58:20 crc kubenswrapper[4628]: W1211 05:58:20.537930 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70e52eb8_3a47_4192_9d87_3178a99becfe.slice/crio-c0f3ee33017d13b43616c1b0e41af5586b950b81773191ee78ad8cee9b98ce74 WatchSource:0}: Error finding container c0f3ee33017d13b43616c1b0e41af5586b950b81773191ee78ad8cee9b98ce74: Status 404 returned error can't find the container with id c0f3ee33017d13b43616c1b0e41af5586b950b81773191ee78ad8cee9b98ce74 Dec 11 05:58:20 crc kubenswrapper[4628]: I1211 05:58:20.541129 4628 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 05:58:21 crc kubenswrapper[4628]: I1211 05:58:21.375782 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" event={"ID":"70e52eb8-3a47-4192-9d87-3178a99becfe","Type":"ContainerStarted","Data":"c0f3ee33017d13b43616c1b0e41af5586b950b81773191ee78ad8cee9b98ce74"} Dec 11 05:58:22 crc kubenswrapper[4628]: I1211 05:58:22.388899 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" event={"ID":"70e52eb8-3a47-4192-9d87-3178a99becfe","Type":"ContainerStarted","Data":"0995474904b5a1d378a24237e02b85c718e082742b040bf2d7bfef54d21966e1"} Dec 11 05:58:22 crc kubenswrapper[4628]: I1211 05:58:22.412905 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" podStartSLOduration=2.734074604 podStartE2EDuration="3.412885235s" podCreationTimestamp="2025-12-11 05:58:19 +0000 UTC" firstStartedPulling="2025-12-11 05:58:20.54091126 +0000 UTC m=+2602.958257958" lastFinishedPulling="2025-12-11 05:58:21.219721891 +0000 UTC 
m=+2603.637068589" observedRunningTime="2025-12-11 05:58:22.406066961 +0000 UTC m=+2604.823413669" watchObservedRunningTime="2025-12-11 05:58:22.412885235 +0000 UTC m=+2604.830231943" Dec 11 05:58:24 crc kubenswrapper[4628]: I1211 05:58:24.136873 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n2b5l"] Dec 11 05:58:24 crc kubenswrapper[4628]: I1211 05:58:24.140991 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n2b5l" Dec 11 05:58:24 crc kubenswrapper[4628]: I1211 05:58:24.149209 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n2b5l"] Dec 11 05:58:24 crc kubenswrapper[4628]: I1211 05:58:24.248376 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch274\" (UniqueName: \"kubernetes.io/projected/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855-kube-api-access-ch274\") pod \"community-operators-n2b5l\" (UID: \"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855\") " pod="openshift-marketplace/community-operators-n2b5l" Dec 11 05:58:24 crc kubenswrapper[4628]: I1211 05:58:24.248538 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855-utilities\") pod \"community-operators-n2b5l\" (UID: \"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855\") " pod="openshift-marketplace/community-operators-n2b5l" Dec 11 05:58:24 crc kubenswrapper[4628]: I1211 05:58:24.248572 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855-catalog-content\") pod \"community-operators-n2b5l\" (UID: \"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855\") " pod="openshift-marketplace/community-operators-n2b5l" Dec 11 05:58:24 crc kubenswrapper[4628]: I1211 05:58:24.350800 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855-catalog-content\") pod \"community-operators-n2b5l\" (UID: \"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855\") " pod="openshift-marketplace/community-operators-n2b5l" Dec 11 05:58:24 crc kubenswrapper[4628]: I1211 05:58:24.351001 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch274\" (UniqueName: \"kubernetes.io/projected/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855-kube-api-access-ch274\") pod \"community-operators-n2b5l\" (UID: \"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855\") " pod="openshift-marketplace/community-operators-n2b5l" Dec 11 05:58:24 crc kubenswrapper[4628]: I1211 05:58:24.351297 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855-catalog-content\") pod \"community-operators-n2b5l\" (UID: \"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855\") " pod="openshift-marketplace/community-operators-n2b5l" Dec 11 05:58:24 crc kubenswrapper[4628]: I1211 05:58:24.351526 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855-utilities\") pod \"community-operators-n2b5l\" (UID: \"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855\") " pod="openshift-marketplace/community-operators-n2b5l" Dec 11 
05:58:24 crc kubenswrapper[4628]: I1211 05:58:24.351921 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855-utilities\") pod \"community-operators-n2b5l\" (UID: \"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855\") " pod="openshift-marketplace/community-operators-n2b5l" Dec 11 05:58:24 crc kubenswrapper[4628]: I1211 05:58:24.372552 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch274\" (UniqueName: \"kubernetes.io/projected/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855-kube-api-access-ch274\") pod \"community-operators-n2b5l\" (UID: \"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855\") " pod="openshift-marketplace/community-operators-n2b5l" Dec 11 05:58:24 crc kubenswrapper[4628]: I1211 05:58:24.483489 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n2b5l" Dec 11 05:58:25 crc kubenswrapper[4628]: W1211 05:58:25.081411 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6f8bbe3_c624_49dd_9ba6_6fb09b84e855.slice/crio-d98d63dac6c5a7c4106603287163abcff7eb9bad7b8bfa75e0064e28af652ef8 WatchSource:0}: Error finding container d98d63dac6c5a7c4106603287163abcff7eb9bad7b8bfa75e0064e28af652ef8: Status 404 returned error can't find the container with id d98d63dac6c5a7c4106603287163abcff7eb9bad7b8bfa75e0064e28af652ef8 Dec 11 05:58:25 crc kubenswrapper[4628]: I1211 05:58:25.101920 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n2b5l"] Dec 11 05:58:25 crc kubenswrapper[4628]: I1211 05:58:25.416314 4628 generic.go:334] "Generic (PLEG): container finished" podID="f6f8bbe3-c624-49dd-9ba6-6fb09b84e855" containerID="7acc72a5fd221be10391e1b78207ad9f1f764b90f81770263b9c1cb5e3c06f34" exitCode=0 Dec 11 05:58:25 crc kubenswrapper[4628]: I1211 05:58:25.417311 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2b5l" event={"ID":"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855","Type":"ContainerDied","Data":"7acc72a5fd221be10391e1b78207ad9f1f764b90f81770263b9c1cb5e3c06f34"} Dec 11 05:58:25 crc kubenswrapper[4628]: I1211 05:58:25.417429 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2b5l" event={"ID":"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855","Type":"ContainerStarted","Data":"d98d63dac6c5a7c4106603287163abcff7eb9bad7b8bfa75e0064e28af652ef8"} Dec 11 05:58:26 crc kubenswrapper[4628]: I1211 05:58:26.429948 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2b5l" event={"ID":"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855","Type":"ContainerStarted","Data":"fb9e007ab16ff1d89731d0b1be27a75e23900428ab8e0afc20f65bf722c6cf45"} Dec 11 05:58:28 crc kubenswrapper[4628]: I1211 05:58:28.457003 4628 generic.go:334] "Generic (PLEG): container finished" podID="f6f8bbe3-c624-49dd-9ba6-6fb09b84e855" containerID="fb9e007ab16ff1d89731d0b1be27a75e23900428ab8e0afc20f65bf722c6cf45" exitCode=0 Dec 11 05:58:28 crc kubenswrapper[4628]: I1211 05:58:28.457055 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2b5l" event={"ID":"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855","Type":"ContainerDied","Data":"fb9e007ab16ff1d89731d0b1be27a75e23900428ab8e0afc20f65bf722c6cf45"} Dec 11 05:58:29 crc kubenswrapper[4628]: I1211 05:58:29.467387 4628 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2b5l" event={"ID":"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855","Type":"ContainerStarted","Data":"87a657f989e9b0eb1b403d60bf4c56db48b457e0656b9763544f3deeb0b1d652"} Dec 11 05:58:29 crc kubenswrapper[4628]: I1211 05:58:29.490205 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n2b5l" podStartSLOduration=1.802056795 podStartE2EDuration="5.490187721s" podCreationTimestamp="2025-12-11 05:58:24 +0000 UTC" firstStartedPulling="2025-12-11 05:58:25.419460586 +0000 UTC m=+2607.836807284" lastFinishedPulling="2025-12-11 05:58:29.107591462 +0000 UTC m=+2611.524938210" observedRunningTime="2025-12-11 05:58:29.4849612 +0000 UTC m=+2611.902307898" watchObservedRunningTime="2025-12-11 05:58:29.490187721 +0000 UTC m=+2611.907534419" Dec 11 05:58:34 crc kubenswrapper[4628]: I1211 05:58:34.483752 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n2b5l" Dec 11 05:58:34 crc kubenswrapper[4628]: I1211 05:58:34.485211 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n2b5l" Dec 11 05:58:34 crc kubenswrapper[4628]: I1211 05:58:34.546350 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n2b5l" Dec 11 05:58:34 crc kubenswrapper[4628]: I1211 05:58:34.603692 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n2b5l" Dec 11 05:58:34 crc kubenswrapper[4628]: I1211 05:58:34.788268 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n2b5l"] Dec 11 05:58:36 crc kubenswrapper[4628]: I1211 05:58:36.548438 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n2b5l" podUID="f6f8bbe3-c624-49dd-9ba6-6fb09b84e855" containerName="registry-server" containerID="cri-o://87a657f989e9b0eb1b403d60bf4c56db48b457e0656b9763544f3deeb0b1d652" gracePeriod=2 Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.011228 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n2b5l" Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.152497 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855-utilities\") pod \"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855\" (UID: \"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855\") " Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.153278 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch274\" (UniqueName: \"kubernetes.io/projected/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855-kube-api-access-ch274\") pod \"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855\" (UID: \"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855\") " Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.153396 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855-catalog-content\") pod \"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855\" (UID: \"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855\") " Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.154453 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855-utilities" (OuterVolumeSpecName: "utilities") pod "f6f8bbe3-c624-49dd-9ba6-6fb09b84e855" (UID: "f6f8bbe3-c624-49dd-9ba6-6fb09b84e855"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.154675 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.185148 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855-kube-api-access-ch274" (OuterVolumeSpecName: "kube-api-access-ch274") pod "f6f8bbe3-c624-49dd-9ba6-6fb09b84e855" (UID: "f6f8bbe3-c624-49dd-9ba6-6fb09b84e855"). InnerVolumeSpecName "kube-api-access-ch274". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.215659 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6f8bbe3-c624-49dd-9ba6-6fb09b84e855" (UID: "f6f8bbe3-c624-49dd-9ba6-6fb09b84e855"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.256651 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch274\" (UniqueName: \"kubernetes.io/projected/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855-kube-api-access-ch274\") on node \"crc\" DevicePath \"\"" Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.256699 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.557295 4628 generic.go:334] "Generic (PLEG): container finished" podID="f6f8bbe3-c624-49dd-9ba6-6fb09b84e855" containerID="87a657f989e9b0eb1b403d60bf4c56db48b457e0656b9763544f3deeb0b1d652" exitCode=0 Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.557361 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2b5l" event={"ID":"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855","Type":"ContainerDied","Data":"87a657f989e9b0eb1b403d60bf4c56db48b457e0656b9763544f3deeb0b1d652"} Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.557413 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2b5l" event={"ID":"f6f8bbe3-c624-49dd-9ba6-6fb09b84e855","Type":"ContainerDied","Data":"d98d63dac6c5a7c4106603287163abcff7eb9bad7b8bfa75e0064e28af652ef8"} Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.557434 4628 scope.go:117] "RemoveContainer" containerID="87a657f989e9b0eb1b403d60bf4c56db48b457e0656b9763544f3deeb0b1d652" Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.558260 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n2b5l" Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.576658 4628 scope.go:117] "RemoveContainer" containerID="fb9e007ab16ff1d89731d0b1be27a75e23900428ab8e0afc20f65bf722c6cf45" Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.596805 4628 scope.go:117] "RemoveContainer" containerID="7acc72a5fd221be10391e1b78207ad9f1f764b90f81770263b9c1cb5e3c06f34" Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.662586 4628 scope.go:117] "RemoveContainer" containerID="87a657f989e9b0eb1b403d60bf4c56db48b457e0656b9763544f3deeb0b1d652" Dec 11 05:58:37 crc kubenswrapper[4628]: E1211 05:58:37.664230 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87a657f989e9b0eb1b403d60bf4c56db48b457e0656b9763544f3deeb0b1d652\": container with ID starting with 87a657f989e9b0eb1b403d60bf4c56db48b457e0656b9763544f3deeb0b1d652 not found: ID does not exist" containerID="87a657f989e9b0eb1b403d60bf4c56db48b457e0656b9763544f3deeb0b1d652" Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.664331 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87a657f989e9b0eb1b403d60bf4c56db48b457e0656b9763544f3deeb0b1d652"} err="failed to get container status \"87a657f989e9b0eb1b403d60bf4c56db48b457e0656b9763544f3deeb0b1d652\": rpc error: code = NotFound desc = could not find container \"87a657f989e9b0eb1b403d60bf4c56db48b457e0656b9763544f3deeb0b1d652\": container with ID starting with 87a657f989e9b0eb1b403d60bf4c56db48b457e0656b9763544f3deeb0b1d652 not found: ID does not exist" Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.664497 4628 scope.go:117] "RemoveContainer" containerID="fb9e007ab16ff1d89731d0b1be27a75e23900428ab8e0afc20f65bf722c6cf45" Dec 11 05:58:37 crc kubenswrapper[4628]: E1211 05:58:37.666938 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb9e007ab16ff1d89731d0b1be27a75e23900428ab8e0afc20f65bf722c6cf45\": container with ID starting with fb9e007ab16ff1d89731d0b1be27a75e23900428ab8e0afc20f65bf722c6cf45 not found: ID does not exist" containerID="fb9e007ab16ff1d89731d0b1be27a75e23900428ab8e0afc20f65bf722c6cf45" Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.666985 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9e007ab16ff1d89731d0b1be27a75e23900428ab8e0afc20f65bf722c6cf45"} err="failed to get container status \"fb9e007ab16ff1d89731d0b1be27a75e23900428ab8e0afc20f65bf722c6cf45\": rpc error: code = NotFound desc = could not find container \"fb9e007ab16ff1d89731d0b1be27a75e23900428ab8e0afc20f65bf722c6cf45\": container with ID starting with fb9e007ab16ff1d89731d0b1be27a75e23900428ab8e0afc20f65bf722c6cf45 not found: ID does not exist" Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.667013 4628 scope.go:117] "RemoveContainer" containerID="7acc72a5fd221be10391e1b78207ad9f1f764b90f81770263b9c1cb5e3c06f34" Dec 11 05:58:37 crc kubenswrapper[4628]: E1211 05:58:37.667409 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7acc72a5fd221be10391e1b78207ad9f1f764b90f81770263b9c1cb5e3c06f34\": container with ID starting with 7acc72a5fd221be10391e1b78207ad9f1f764b90f81770263b9c1cb5e3c06f34 not found: ID does not exist" containerID="7acc72a5fd221be10391e1b78207ad9f1f764b90f81770263b9c1cb5e3c06f34" 
Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.667448 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7acc72a5fd221be10391e1b78207ad9f1f764b90f81770263b9c1cb5e3c06f34"} err="failed to get container status \"7acc72a5fd221be10391e1b78207ad9f1f764b90f81770263b9c1cb5e3c06f34\": rpc error: code = NotFound desc = could not find container \"7acc72a5fd221be10391e1b78207ad9f1f764b90f81770263b9c1cb5e3c06f34\": container with ID starting with 7acc72a5fd221be10391e1b78207ad9f1f764b90f81770263b9c1cb5e3c06f34 not found: ID does not exist" Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.684316 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n2b5l"] Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.694829 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n2b5l"] Dec 11 05:58:37 crc kubenswrapper[4628]: I1211 05:58:37.902831 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6f8bbe3-c624-49dd-9ba6-6fb09b84e855" path="/var/lib/kubelet/pods/f6f8bbe3-c624-49dd-9ba6-6fb09b84e855/volumes" Dec 11 05:59:31 crc kubenswrapper[4628]: I1211 05:59:31.427341 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 05:59:31 crc kubenswrapper[4628]: I1211 05:59:31.428263 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:00:00 crc kubenswrapper[4628]: I1211 06:00:00.167346 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4"] Dec 11 06:00:00 crc kubenswrapper[4628]: E1211 06:00:00.174074 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f8bbe3-c624-49dd-9ba6-6fb09b84e855" containerName="extract-content" Dec 11 06:00:00 crc kubenswrapper[4628]: I1211 06:00:00.174104 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f8bbe3-c624-49dd-9ba6-6fb09b84e855" containerName="extract-content" Dec 11 06:00:00 crc kubenswrapper[4628]: E1211 06:00:00.174161 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f8bbe3-c624-49dd-9ba6-6fb09b84e855" containerName="registry-server" Dec 11 06:00:00 crc kubenswrapper[4628]: I1211 06:00:00.174172 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f8bbe3-c624-49dd-9ba6-6fb09b84e855" containerName="registry-server" Dec 11 06:00:00 crc kubenswrapper[4628]: E1211 06:00:00.174238 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f8bbe3-c624-49dd-9ba6-6fb09b84e855" containerName="extract-utilities" Dec 11 06:00:00 crc kubenswrapper[4628]: I1211 06:00:00.174252 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f8bbe3-c624-49dd-9ba6-6fb09b84e855" containerName="extract-utilities" Dec 11 06:00:00 crc kubenswrapper[4628]: I1211 06:00:00.185555 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f8bbe3-c624-49dd-9ba6-6fb09b84e855" containerName="registry-server" Dec 11 06:00:00 crc 
kubenswrapper[4628]: I1211 06:00:00.187147 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4" Dec 11 06:00:00 crc kubenswrapper[4628]: I1211 06:00:00.190980 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 11 06:00:00 crc kubenswrapper[4628]: I1211 06:00:00.192648 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 11 06:00:00 crc kubenswrapper[4628]: I1211 06:00:00.219777 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4"] Dec 11 06:00:00 crc kubenswrapper[4628]: I1211 06:00:00.269205 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr97r\" (UniqueName: \"kubernetes.io/projected/c5892c32-a678-4f96-aaa9-03f39b6fe036-kube-api-access-jr97r\") pod \"collect-profiles-29423880-6t2x4\" (UID: \"c5892c32-a678-4f96-aaa9-03f39b6fe036\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4" Dec 11 06:00:00 crc kubenswrapper[4628]: I1211 06:00:00.269598 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5892c32-a678-4f96-aaa9-03f39b6fe036-config-volume\") pod \"collect-profiles-29423880-6t2x4\" (UID: \"c5892c32-a678-4f96-aaa9-03f39b6fe036\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4" Dec 11 06:00:00 crc kubenswrapper[4628]: I1211 06:00:00.269700 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5892c32-a678-4f96-aaa9-03f39b6fe036-secret-volume\") pod \"collect-profiles-29423880-6t2x4\" (UID: \"c5892c32-a678-4f96-aaa9-03f39b6fe036\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4" Dec 11 06:00:00 crc kubenswrapper[4628]: I1211 06:00:00.371541 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5892c32-a678-4f96-aaa9-03f39b6fe036-config-volume\") pod \"collect-profiles-29423880-6t2x4\" (UID: \"c5892c32-a678-4f96-aaa9-03f39b6fe036\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4" Dec 11 06:00:00 crc kubenswrapper[4628]: I1211 06:00:00.371597 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5892c32-a678-4f96-aaa9-03f39b6fe036-secret-volume\") pod \"collect-profiles-29423880-6t2x4\" (UID: \"c5892c32-a678-4f96-aaa9-03f39b6fe036\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4" Dec 11 06:00:00 crc kubenswrapper[4628]: I1211 06:00:00.371623 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr97r\" (UniqueName: \"kubernetes.io/projected/c5892c32-a678-4f96-aaa9-03f39b6fe036-kube-api-access-jr97r\") pod \"collect-profiles-29423880-6t2x4\" (UID: \"c5892c32-a678-4f96-aaa9-03f39b6fe036\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4" Dec 11 06:00:00 crc kubenswrapper[4628]: I1211 06:00:00.373146 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/c5892c32-a678-4f96-aaa9-03f39b6fe036-config-volume\") pod \"collect-profiles-29423880-6t2x4\" (UID: \"c5892c32-a678-4f96-aaa9-03f39b6fe036\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4" Dec 11 06:00:00 crc kubenswrapper[4628]: I1211 06:00:00.392468 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5892c32-a678-4f96-aaa9-03f39b6fe036-secret-volume\") pod \"collect-profiles-29423880-6t2x4\" (UID: \"c5892c32-a678-4f96-aaa9-03f39b6fe036\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4" Dec 11 06:00:00 crc kubenswrapper[4628]: I1211 06:00:00.392744 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr97r\" (UniqueName: \"kubernetes.io/projected/c5892c32-a678-4f96-aaa9-03f39b6fe036-kube-api-access-jr97r\") pod \"collect-profiles-29423880-6t2x4\" (UID: \"c5892c32-a678-4f96-aaa9-03f39b6fe036\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4" Dec 11 06:00:00 crc kubenswrapper[4628]: I1211 06:00:00.544259 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4" Dec 11 06:00:01 crc kubenswrapper[4628]: I1211 06:00:01.043206 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4"] Dec 11 06:00:01 crc kubenswrapper[4628]: I1211 06:00:01.421006 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4" event={"ID":"c5892c32-a678-4f96-aaa9-03f39b6fe036","Type":"ContainerDied","Data":"5fab9aa013bea59abbe59841a20d9131aa481616c58edc44e60748ae722774c6"} Dec 11 06:00:01 crc kubenswrapper[4628]: I1211 06:00:01.420825 4628 generic.go:334] "Generic (PLEG): container finished" podID="c5892c32-a678-4f96-aaa9-03f39b6fe036" containerID="5fab9aa013bea59abbe59841a20d9131aa481616c58edc44e60748ae722774c6" exitCode=0 Dec 11 06:00:01 crc kubenswrapper[4628]: I1211 06:00:01.422681 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4" event={"ID":"c5892c32-a678-4f96-aaa9-03f39b6fe036","Type":"ContainerStarted","Data":"0e11887493322a0c3c770b35caf4e8768e83cdd05cfcdb51e88c300d931ae49d"} Dec 11 06:00:01 crc kubenswrapper[4628]: I1211 06:00:01.426429 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:00:01 crc kubenswrapper[4628]: I1211 06:00:01.426612 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:00:02 crc kubenswrapper[4628]: I1211 06:00:02.794617 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4" Dec 11 06:00:02 crc kubenswrapper[4628]: I1211 06:00:02.925022 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5892c32-a678-4f96-aaa9-03f39b6fe036-secret-volume\") pod \"c5892c32-a678-4f96-aaa9-03f39b6fe036\" (UID: \"c5892c32-a678-4f96-aaa9-03f39b6fe036\") " Dec 11 06:00:02 crc kubenswrapper[4628]: I1211 06:00:02.925159 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr97r\" (UniqueName: \"kubernetes.io/projected/c5892c32-a678-4f96-aaa9-03f39b6fe036-kube-api-access-jr97r\") pod \"c5892c32-a678-4f96-aaa9-03f39b6fe036\" (UID: \"c5892c32-a678-4f96-aaa9-03f39b6fe036\") " Dec 11 06:00:02 crc kubenswrapper[4628]: I1211 06:00:02.925219 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5892c32-a678-4f96-aaa9-03f39b6fe036-config-volume\") pod \"c5892c32-a678-4f96-aaa9-03f39b6fe036\" (UID: \"c5892c32-a678-4f96-aaa9-03f39b6fe036\") " Dec 11 06:00:02 crc kubenswrapper[4628]: I1211 06:00:02.927880 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5892c32-a678-4f96-aaa9-03f39b6fe036-config-volume" (OuterVolumeSpecName: "config-volume") pod "c5892c32-a678-4f96-aaa9-03f39b6fe036" (UID: "c5892c32-a678-4f96-aaa9-03f39b6fe036"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 06:00:02 crc kubenswrapper[4628]: I1211 06:00:02.956090 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5892c32-a678-4f96-aaa9-03f39b6fe036-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c5892c32-a678-4f96-aaa9-03f39b6fe036" (UID: "c5892c32-a678-4f96-aaa9-03f39b6fe036"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 06:00:02 crc kubenswrapper[4628]: I1211 06:00:02.956183 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5892c32-a678-4f96-aaa9-03f39b6fe036-kube-api-access-jr97r" (OuterVolumeSpecName: "kube-api-access-jr97r") pod "c5892c32-a678-4f96-aaa9-03f39b6fe036" (UID: "c5892c32-a678-4f96-aaa9-03f39b6fe036"). InnerVolumeSpecName "kube-api-access-jr97r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:00:03 crc kubenswrapper[4628]: I1211 06:00:03.027355 4628 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c5892c32-a678-4f96-aaa9-03f39b6fe036-config-volume\") on node \"crc\" DevicePath \"\"" Dec 11 06:00:03 crc kubenswrapper[4628]: I1211 06:00:03.027387 4628 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c5892c32-a678-4f96-aaa9-03f39b6fe036-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 11 06:00:03 crc kubenswrapper[4628]: I1211 06:00:03.027396 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr97r\" (UniqueName: \"kubernetes.io/projected/c5892c32-a678-4f96-aaa9-03f39b6fe036-kube-api-access-jr97r\") on node \"crc\" DevicePath \"\"" Dec 11 06:00:03 crc kubenswrapper[4628]: I1211 06:00:03.440209 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4" event={"ID":"c5892c32-a678-4f96-aaa9-03f39b6fe036","Type":"ContainerDied","Data":"0e11887493322a0c3c770b35caf4e8768e83cdd05cfcdb51e88c300d931ae49d"} Dec 11 06:00:03 crc kubenswrapper[4628]: I1211 06:00:03.440247 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e11887493322a0c3c770b35caf4e8768e83cdd05cfcdb51e88c300d931ae49d" Dec 11 06:00:03 crc kubenswrapper[4628]: I1211 06:00:03.440290 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4" Dec 11 06:00:03 crc kubenswrapper[4628]: I1211 06:00:03.877799 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch"] Dec 11 06:00:03 crc kubenswrapper[4628]: I1211 06:00:03.884433 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423835-4gtch"] Dec 11 06:00:03 crc kubenswrapper[4628]: I1211 06:00:03.900469 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="209cebdd-7761-42a6-9bf1-089cc06c3dca" path="/var/lib/kubelet/pods/209cebdd-7761-42a6-9bf1-089cc06c3dca/volumes" Dec 11 06:00:28 crc kubenswrapper[4628]: I1211 06:00:28.275135 4628 scope.go:117] "RemoveContainer" containerID="ea28b50587a764088b354a86fa7c17a4227fe9490d614e081e53ac7aeb376396" Dec 11 06:00:31 crc kubenswrapper[4628]: I1211 06:00:31.427522 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:00:31 crc kubenswrapper[4628]: I1211 06:00:31.428022 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:00:31 crc kubenswrapper[4628]: I1211 06:00:31.428065 4628 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 06:00:31 crc kubenswrapper[4628]: I1211 06:00:31.428888 4628 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"522036b9d0739c4f5dc7b8387fb0052860bb6519261fec1597dd6379d1a48f3b"} pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 06:00:31 crc kubenswrapper[4628]: I1211 06:00:31.428933 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" containerID="cri-o://522036b9d0739c4f5dc7b8387fb0052860bb6519261fec1597dd6379d1a48f3b" gracePeriod=600 Dec 11 06:00:31 crc kubenswrapper[4628]: I1211 06:00:31.712631 4628 generic.go:334] "Generic (PLEG): container finished" podID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerID="522036b9d0739c4f5dc7b8387fb0052860bb6519261fec1597dd6379d1a48f3b" exitCode=0 Dec 11 06:00:31 crc kubenswrapper[4628]: I1211 06:00:31.712677 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerDied","Data":"522036b9d0739c4f5dc7b8387fb0052860bb6519261fec1597dd6379d1a48f3b"} Dec 11 06:00:31 crc kubenswrapper[4628]: I1211 06:00:31.712746 4628 scope.go:117] "RemoveContainer" containerID="bd43c9482798e87ae96e8bb946acba48679610b1617b7d96dd3c1a981ebf31b0" Dec 11 06:00:32 crc kubenswrapper[4628]: I1211 06:00:32.725400 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944"} Dec 11 06:00:41 crc kubenswrapper[4628]: I1211 06:00:41.156805 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x4d8l"] Dec 11 06:00:41 crc kubenswrapper[4628]: E1211 06:00:41.158154 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5892c32-a678-4f96-aaa9-03f39b6fe036" containerName="collect-profiles" Dec 11 06:00:41 crc kubenswrapper[4628]: I1211 06:00:41.158179 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5892c32-a678-4f96-aaa9-03f39b6fe036" containerName="collect-profiles" Dec 11 06:00:41 crc kubenswrapper[4628]: I1211 06:00:41.158542 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5892c32-a678-4f96-aaa9-03f39b6fe036" containerName="collect-profiles" Dec 11 06:00:41 crc kubenswrapper[4628]: I1211 06:00:41.160931 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4d8l" Dec 11 06:00:41 crc kubenswrapper[4628]: I1211 06:00:41.184058 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4d8l"] Dec 11 06:00:41 crc kubenswrapper[4628]: I1211 06:00:41.196231 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6d7b55d-5cac-432f-babc-7a2bfadd28a3-catalog-content\") pod \"redhat-marketplace-x4d8l\" (UID: \"e6d7b55d-5cac-432f-babc-7a2bfadd28a3\") " pod="openshift-marketplace/redhat-marketplace-x4d8l" Dec 11 06:00:41 crc kubenswrapper[4628]: I1211 06:00:41.196332 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhbrh\" (UniqueName: \"kubernetes.io/projected/e6d7b55d-5cac-432f-babc-7a2bfadd28a3-kube-api-access-nhbrh\") pod \"redhat-marketplace-x4d8l\" (UID: \"e6d7b55d-5cac-432f-babc-7a2bfadd28a3\") " pod="openshift-marketplace/redhat-marketplace-x4d8l" Dec 11 06:00:41 crc kubenswrapper[4628]: I1211 06:00:41.196406 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6d7b55d-5cac-432f-babc-7a2bfadd28a3-utilities\") pod \"redhat-marketplace-x4d8l\" (UID: \"e6d7b55d-5cac-432f-babc-7a2bfadd28a3\") " pod="openshift-marketplace/redhat-marketplace-x4d8l" Dec 11 06:00:41 crc kubenswrapper[4628]: I1211 06:00:41.298420 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6d7b55d-5cac-432f-babc-7a2bfadd28a3-catalog-content\") pod \"redhat-marketplace-x4d8l\" (UID: \"e6d7b55d-5cac-432f-babc-7a2bfadd28a3\") " pod="openshift-marketplace/redhat-marketplace-x4d8l" Dec 11 06:00:41 crc kubenswrapper[4628]: I1211 06:00:41.298525 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhbrh\" (UniqueName: \"kubernetes.io/projected/e6d7b55d-5cac-432f-babc-7a2bfadd28a3-kube-api-access-nhbrh\") pod \"redhat-marketplace-x4d8l\" (UID: \"e6d7b55d-5cac-432f-babc-7a2bfadd28a3\") " pod="openshift-marketplace/redhat-marketplace-x4d8l" Dec 11 06:00:41 crc kubenswrapper[4628]: I1211 06:00:41.298577 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6d7b55d-5cac-432f-babc-7a2bfadd28a3-utilities\") pod \"redhat-marketplace-x4d8l\" (UID: \"e6d7b55d-5cac-432f-babc-7a2bfadd28a3\") " pod="openshift-marketplace/redhat-marketplace-x4d8l" Dec 11 06:00:41 crc kubenswrapper[4628]: I1211 06:00:41.298973 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6d7b55d-5cac-432f-babc-7a2bfadd28a3-catalog-content\") pod \"redhat-marketplace-x4d8l\" (UID: \"e6d7b55d-5cac-432f-babc-7a2bfadd28a3\") " pod="openshift-marketplace/redhat-marketplace-x4d8l" Dec 11 06:00:41 crc kubenswrapper[4628]: I1211 06:00:41.299091 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6d7b55d-5cac-432f-babc-7a2bfadd28a3-utilities\") pod \"redhat-marketplace-x4d8l\" (UID: \"e6d7b55d-5cac-432f-babc-7a2bfadd28a3\") " pod="openshift-marketplace/redhat-marketplace-x4d8l" Dec 11 06:00:41 crc kubenswrapper[4628]: I1211 06:00:41.317538 4628 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-nhbrh\" (UniqueName: \"kubernetes.io/projected/e6d7b55d-5cac-432f-babc-7a2bfadd28a3-kube-api-access-nhbrh\") pod \"redhat-marketplace-x4d8l\" (UID: \"e6d7b55d-5cac-432f-babc-7a2bfadd28a3\") " pod="openshift-marketplace/redhat-marketplace-x4d8l" Dec 11 06:00:41 crc kubenswrapper[4628]: I1211 06:00:41.494226 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4d8l" Dec 11 06:00:41 crc kubenswrapper[4628]: I1211 06:00:41.983156 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4d8l"] Dec 11 06:00:41 crc kubenswrapper[4628]: W1211 06:00:41.994138 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d7b55d_5cac_432f_babc_7a2bfadd28a3.slice/crio-34b0b80902dfb161c73643263dcda4aa7fc64b7bd28c0394878d417d2cb153c0 WatchSource:0}: Error finding container 34b0b80902dfb161c73643263dcda4aa7fc64b7bd28c0394878d417d2cb153c0: Status 404 returned error can't find the container with id 34b0b80902dfb161c73643263dcda4aa7fc64b7bd28c0394878d417d2cb153c0 Dec 11 06:00:42 crc kubenswrapper[4628]: I1211 06:00:42.844901 4628 generic.go:334] "Generic (PLEG): container finished" podID="e6d7b55d-5cac-432f-babc-7a2bfadd28a3" containerID="35ffc24e2195969369f35052fa2223046c5848e9f2dd9d3adde0106c4073e69c" exitCode=0 Dec 11 06:00:42 crc kubenswrapper[4628]: I1211 06:00:42.844951 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4d8l" event={"ID":"e6d7b55d-5cac-432f-babc-7a2bfadd28a3","Type":"ContainerDied","Data":"35ffc24e2195969369f35052fa2223046c5848e9f2dd9d3adde0106c4073e69c"} Dec 11 06:00:42 crc kubenswrapper[4628]: I1211 06:00:42.844980 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4d8l" event={"ID":"e6d7b55d-5cac-432f-babc-7a2bfadd28a3","Type":"ContainerStarted","Data":"34b0b80902dfb161c73643263dcda4aa7fc64b7bd28c0394878d417d2cb153c0"} Dec 11 06:00:43 crc kubenswrapper[4628]: I1211 06:00:43.859142 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4d8l" event={"ID":"e6d7b55d-5cac-432f-babc-7a2bfadd28a3","Type":"ContainerStarted","Data":"1dceed165881699074c606b17ad83fb1fb2798502f63defc129336e13f8eb011"} Dec 11 06:00:44 crc kubenswrapper[4628]: I1211 06:00:44.872386 4628 generic.go:334] "Generic (PLEG): container finished" podID="e6d7b55d-5cac-432f-babc-7a2bfadd28a3" containerID="1dceed165881699074c606b17ad83fb1fb2798502f63defc129336e13f8eb011" exitCode=0 Dec 11 06:00:44 crc kubenswrapper[4628]: I1211 06:00:44.872441 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4d8l" event={"ID":"e6d7b55d-5cac-432f-babc-7a2bfadd28a3","Type":"ContainerDied","Data":"1dceed165881699074c606b17ad83fb1fb2798502f63defc129336e13f8eb011"} Dec 11 06:00:45 crc kubenswrapper[4628]: I1211 06:00:45.885995 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4d8l" event={"ID":"e6d7b55d-5cac-432f-babc-7a2bfadd28a3","Type":"ContainerStarted","Data":"8dffbf572b659863019e4075479518571a44adb5961fad3c14e4de769447785e"} Dec 11 06:00:51 crc kubenswrapper[4628]: I1211 06:00:51.494652 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x4d8l" Dec 11 06:00:51 crc kubenswrapper[4628]: I1211 
06:00:51.495371 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x4d8l" Dec 11 06:00:51 crc kubenswrapper[4628]: I1211 06:00:51.571374 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x4d8l" Dec 11 06:00:51 crc kubenswrapper[4628]: I1211 06:00:51.606443 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x4d8l" podStartSLOduration=8.113337478 podStartE2EDuration="10.606418317s" podCreationTimestamp="2025-12-11 06:00:41 +0000 UTC" firstStartedPulling="2025-12-11 06:00:42.848714807 +0000 UTC m=+2745.266061516" lastFinishedPulling="2025-12-11 06:00:45.341795657 +0000 UTC m=+2747.759142355" observedRunningTime="2025-12-11 06:00:45.905386505 +0000 UTC m=+2748.322733213" watchObservedRunningTime="2025-12-11 06:00:51.606418317 +0000 UTC m=+2754.023765045" Dec 11 06:00:52 crc kubenswrapper[4628]: I1211 06:00:52.032642 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x4d8l" Dec 11 06:00:52 crc kubenswrapper[4628]: I1211 06:00:52.092158 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4d8l"] Dec 11 06:00:53 crc kubenswrapper[4628]: I1211 06:00:53.998291 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x4d8l" podUID="e6d7b55d-5cac-432f-babc-7a2bfadd28a3" containerName="registry-server" containerID="cri-o://8dffbf572b659863019e4075479518571a44adb5961fad3c14e4de769447785e" gracePeriod=2 Dec 11 06:00:55 crc kubenswrapper[4628]: I1211 06:00:55.105537 4628 generic.go:334] "Generic (PLEG): container finished" podID="e6d7b55d-5cac-432f-babc-7a2bfadd28a3" containerID="8dffbf572b659863019e4075479518571a44adb5961fad3c14e4de769447785e" exitCode=0 Dec 11 06:00:55 crc kubenswrapper[4628]: I1211 06:00:55.106302 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4d8l" event={"ID":"e6d7b55d-5cac-432f-babc-7a2bfadd28a3","Type":"ContainerDied","Data":"8dffbf572b659863019e4075479518571a44adb5961fad3c14e4de769447785e"} Dec 11 06:00:55 crc kubenswrapper[4628]: I1211 06:00:55.432381 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4d8l" Dec 11 06:00:55 crc kubenswrapper[4628]: I1211 06:00:55.606120 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhbrh\" (UniqueName: \"kubernetes.io/projected/e6d7b55d-5cac-432f-babc-7a2bfadd28a3-kube-api-access-nhbrh\") pod \"e6d7b55d-5cac-432f-babc-7a2bfadd28a3\" (UID: \"e6d7b55d-5cac-432f-babc-7a2bfadd28a3\") " Dec 11 06:00:55 crc kubenswrapper[4628]: I1211 06:00:55.606311 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6d7b55d-5cac-432f-babc-7a2bfadd28a3-catalog-content\") pod \"e6d7b55d-5cac-432f-babc-7a2bfadd28a3\" (UID: \"e6d7b55d-5cac-432f-babc-7a2bfadd28a3\") " Dec 11 06:00:55 crc kubenswrapper[4628]: I1211 06:00:55.606379 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6d7b55d-5cac-432f-babc-7a2bfadd28a3-utilities\") pod \"e6d7b55d-5cac-432f-babc-7a2bfadd28a3\" (UID: \"e6d7b55d-5cac-432f-babc-7a2bfadd28a3\") " Dec 11 06:00:55 crc kubenswrapper[4628]: I1211 06:00:55.607958 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6d7b55d-5cac-432f-babc-7a2bfadd28a3-utilities" (OuterVolumeSpecName: "utilities") pod "e6d7b55d-5cac-432f-babc-7a2bfadd28a3" (UID: "e6d7b55d-5cac-432f-babc-7a2bfadd28a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:00:55 crc kubenswrapper[4628]: I1211 06:00:55.614595 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d7b55d-5cac-432f-babc-7a2bfadd28a3-kube-api-access-nhbrh" (OuterVolumeSpecName: "kube-api-access-nhbrh") pod "e6d7b55d-5cac-432f-babc-7a2bfadd28a3" (UID: "e6d7b55d-5cac-432f-babc-7a2bfadd28a3"). InnerVolumeSpecName "kube-api-access-nhbrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:00:55 crc kubenswrapper[4628]: I1211 06:00:55.630118 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6d7b55d-5cac-432f-babc-7a2bfadd28a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6d7b55d-5cac-432f-babc-7a2bfadd28a3" (UID: "e6d7b55d-5cac-432f-babc-7a2bfadd28a3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:00:55 crc kubenswrapper[4628]: I1211 06:00:55.708217 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6d7b55d-5cac-432f-babc-7a2bfadd28a3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 06:00:55 crc kubenswrapper[4628]: I1211 06:00:55.708546 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6d7b55d-5cac-432f-babc-7a2bfadd28a3-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 06:00:55 crc kubenswrapper[4628]: I1211 06:00:55.708557 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhbrh\" (UniqueName: \"kubernetes.io/projected/e6d7b55d-5cac-432f-babc-7a2bfadd28a3-kube-api-access-nhbrh\") on node \"crc\" DevicePath \"\"" Dec 11 06:00:56 crc kubenswrapper[4628]: I1211 06:00:56.133361 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x4d8l" event={"ID":"e6d7b55d-5cac-432f-babc-7a2bfadd28a3","Type":"ContainerDied","Data":"34b0b80902dfb161c73643263dcda4aa7fc64b7bd28c0394878d417d2cb153c0"} Dec 11 06:00:56 crc kubenswrapper[4628]: I1211 06:00:56.133415 4628 scope.go:117] "RemoveContainer" containerID="8dffbf572b659863019e4075479518571a44adb5961fad3c14e4de769447785e" Dec 11 06:00:56 crc kubenswrapper[4628]: I1211 06:00:56.133446 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x4d8l" Dec 11 06:00:56 crc kubenswrapper[4628]: I1211 06:00:56.160837 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4d8l"] Dec 11 06:00:56 crc kubenswrapper[4628]: I1211 06:00:56.165535 4628 scope.go:117] "RemoveContainer" containerID="1dceed165881699074c606b17ad83fb1fb2798502f63defc129336e13f8eb011" Dec 11 06:00:56 crc kubenswrapper[4628]: I1211 06:00:56.175662 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x4d8l"] Dec 11 06:00:56 crc kubenswrapper[4628]: I1211 06:00:56.197440 4628 scope.go:117] "RemoveContainer" containerID="35ffc24e2195969369f35052fa2223046c5848e9f2dd9d3adde0106c4073e69c" Dec 11 06:00:57 crc kubenswrapper[4628]: I1211 06:00:57.912983 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6d7b55d-5cac-432f-babc-7a2bfadd28a3" path="/var/lib/kubelet/pods/e6d7b55d-5cac-432f-babc-7a2bfadd28a3/volumes" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.151150 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29423881-sxmlb"] Dec 11 06:01:00 crc kubenswrapper[4628]: E1211 06:01:00.152106 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d7b55d-5cac-432f-babc-7a2bfadd28a3" containerName="extract-content" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.152158 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d7b55d-5cac-432f-babc-7a2bfadd28a3" containerName="extract-content" Dec 11 06:01:00 crc kubenswrapper[4628]: E1211 06:01:00.152201 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d7b55d-5cac-432f-babc-7a2bfadd28a3" containerName="extract-utilities" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.152213 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d7b55d-5cac-432f-babc-7a2bfadd28a3" containerName="extract-utilities" Dec 11 06:01:00 crc kubenswrapper[4628]: E1211 06:01:00.152261 4628 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d7b55d-5cac-432f-babc-7a2bfadd28a3" containerName="registry-server" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.152274 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d7b55d-5cac-432f-babc-7a2bfadd28a3" containerName="registry-server" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.152532 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d7b55d-5cac-432f-babc-7a2bfadd28a3" containerName="registry-server" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.154060 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29423881-sxmlb" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.187088 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29423881-sxmlb"] Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.214629 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc4rw\" (UniqueName: \"kubernetes.io/projected/c82db411-744d-4cc8-8ae5-3031c70241d4-kube-api-access-wc4rw\") pod \"keystone-cron-29423881-sxmlb\" (UID: \"c82db411-744d-4cc8-8ae5-3031c70241d4\") " pod="openstack/keystone-cron-29423881-sxmlb" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.214838 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82db411-744d-4cc8-8ae5-3031c70241d4-config-data\") pod \"keystone-cron-29423881-sxmlb\" (UID: \"c82db411-744d-4cc8-8ae5-3031c70241d4\") " pod="openstack/keystone-cron-29423881-sxmlb" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.214956 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c82db411-744d-4cc8-8ae5-3031c70241d4-fernet-keys\") pod \"keystone-cron-29423881-sxmlb\" (UID: \"c82db411-744d-4cc8-8ae5-3031c70241d4\") " pod="openstack/keystone-cron-29423881-sxmlb" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.215065 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82db411-744d-4cc8-8ae5-3031c70241d4-combined-ca-bundle\") pod \"keystone-cron-29423881-sxmlb\" (UID: \"c82db411-744d-4cc8-8ae5-3031c70241d4\") " pod="openstack/keystone-cron-29423881-sxmlb" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.316719 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82db411-744d-4cc8-8ae5-3031c70241d4-config-data\") pod \"keystone-cron-29423881-sxmlb\" (UID: \"c82db411-744d-4cc8-8ae5-3031c70241d4\") " pod="openstack/keystone-cron-29423881-sxmlb" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.316897 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c82db411-744d-4cc8-8ae5-3031c70241d4-fernet-keys\") pod \"keystone-cron-29423881-sxmlb\" (UID: \"c82db411-744d-4cc8-8ae5-3031c70241d4\") " pod="openstack/keystone-cron-29423881-sxmlb" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.317066 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82db411-744d-4cc8-8ae5-3031c70241d4-combined-ca-bundle\") pod 
\"keystone-cron-29423881-sxmlb\" (UID: \"c82db411-744d-4cc8-8ae5-3031c70241d4\") " pod="openstack/keystone-cron-29423881-sxmlb" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.317308 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc4rw\" (UniqueName: \"kubernetes.io/projected/c82db411-744d-4cc8-8ae5-3031c70241d4-kube-api-access-wc4rw\") pod \"keystone-cron-29423881-sxmlb\" (UID: \"c82db411-744d-4cc8-8ae5-3031c70241d4\") " pod="openstack/keystone-cron-29423881-sxmlb" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.325586 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82db411-744d-4cc8-8ae5-3031c70241d4-combined-ca-bundle\") pod \"keystone-cron-29423881-sxmlb\" (UID: \"c82db411-744d-4cc8-8ae5-3031c70241d4\") " pod="openstack/keystone-cron-29423881-sxmlb" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.328539 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82db411-744d-4cc8-8ae5-3031c70241d4-config-data\") pod \"keystone-cron-29423881-sxmlb\" (UID: \"c82db411-744d-4cc8-8ae5-3031c70241d4\") " pod="openstack/keystone-cron-29423881-sxmlb" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.329327 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c82db411-744d-4cc8-8ae5-3031c70241d4-fernet-keys\") pod \"keystone-cron-29423881-sxmlb\" (UID: \"c82db411-744d-4cc8-8ae5-3031c70241d4\") " pod="openstack/keystone-cron-29423881-sxmlb" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.345124 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc4rw\" (UniqueName: \"kubernetes.io/projected/c82db411-744d-4cc8-8ae5-3031c70241d4-kube-api-access-wc4rw\") pod \"keystone-cron-29423881-sxmlb\" (UID: \"c82db411-744d-4cc8-8ae5-3031c70241d4\") " pod="openstack/keystone-cron-29423881-sxmlb" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.478408 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29423881-sxmlb" Dec 11 06:01:00 crc kubenswrapper[4628]: I1211 06:01:00.951096 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29423881-sxmlb"] Dec 11 06:01:01 crc kubenswrapper[4628]: I1211 06:01:01.192695 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29423881-sxmlb" event={"ID":"c82db411-744d-4cc8-8ae5-3031c70241d4","Type":"ContainerStarted","Data":"2753a214c97e94fa249cc62cb84b65fcb5507685df5c4f185b470a8ca904c8ac"} Dec 11 06:01:01 crc kubenswrapper[4628]: I1211 06:01:01.192789 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29423881-sxmlb" event={"ID":"c82db411-744d-4cc8-8ae5-3031c70241d4","Type":"ContainerStarted","Data":"4051f0ae345313d51b9c000dd9b7d3d32d23e7554a19cd692ecb7dda41acc5f1"} Dec 11 06:01:01 crc kubenswrapper[4628]: I1211 06:01:01.216977 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29423881-sxmlb" podStartSLOduration=1.216822522 podStartE2EDuration="1.216822522s" podCreationTimestamp="2025-12-11 06:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 06:01:01.213619395 +0000 UTC m=+2763.630966093" watchObservedRunningTime="2025-12-11 06:01:01.216822522 +0000 UTC m=+2763.634169230" Dec 11 06:01:04 crc kubenswrapper[4628]: I1211 06:01:04.232220 4628 generic.go:334] "Generic (PLEG): container finished" podID="c82db411-744d-4cc8-8ae5-3031c70241d4" containerID="2753a214c97e94fa249cc62cb84b65fcb5507685df5c4f185b470a8ca904c8ac" exitCode=0 Dec 11 06:01:04 crc kubenswrapper[4628]: I1211 06:01:04.232313 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29423881-sxmlb" event={"ID":"c82db411-744d-4cc8-8ae5-3031c70241d4","Type":"ContainerDied","Data":"2753a214c97e94fa249cc62cb84b65fcb5507685df5c4f185b470a8ca904c8ac"} Dec 11 06:01:05 crc kubenswrapper[4628]: I1211 06:01:05.605247 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29423881-sxmlb" Dec 11 06:01:05 crc kubenswrapper[4628]: I1211 06:01:05.630564 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c82db411-744d-4cc8-8ae5-3031c70241d4-fernet-keys\") pod \"c82db411-744d-4cc8-8ae5-3031c70241d4\" (UID: \"c82db411-744d-4cc8-8ae5-3031c70241d4\") " Dec 11 06:01:05 crc kubenswrapper[4628]: I1211 06:01:05.631962 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82db411-744d-4cc8-8ae5-3031c70241d4-combined-ca-bundle\") pod \"c82db411-744d-4cc8-8ae5-3031c70241d4\" (UID: \"c82db411-744d-4cc8-8ae5-3031c70241d4\") " Dec 11 06:01:05 crc kubenswrapper[4628]: I1211 06:01:05.632100 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82db411-744d-4cc8-8ae5-3031c70241d4-config-data\") pod \"c82db411-744d-4cc8-8ae5-3031c70241d4\" (UID: \"c82db411-744d-4cc8-8ae5-3031c70241d4\") " Dec 11 06:01:05 crc kubenswrapper[4628]: I1211 06:01:05.632250 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc4rw\" (UniqueName: \"kubernetes.io/projected/c82db411-744d-4cc8-8ae5-3031c70241d4-kube-api-access-wc4rw\") pod \"c82db411-744d-4cc8-8ae5-3031c70241d4\" (UID: \"c82db411-744d-4cc8-8ae5-3031c70241d4\") " Dec 11 06:01:05 crc kubenswrapper[4628]: I1211 06:01:05.643060 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c82db411-744d-4cc8-8ae5-3031c70241d4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c82db411-744d-4cc8-8ae5-3031c70241d4" (UID: "c82db411-744d-4cc8-8ae5-3031c70241d4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 06:01:05 crc kubenswrapper[4628]: I1211 06:01:05.643365 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c82db411-744d-4cc8-8ae5-3031c70241d4-kube-api-access-wc4rw" (OuterVolumeSpecName: "kube-api-access-wc4rw") pod "c82db411-744d-4cc8-8ae5-3031c70241d4" (UID: "c82db411-744d-4cc8-8ae5-3031c70241d4"). InnerVolumeSpecName "kube-api-access-wc4rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:01:05 crc kubenswrapper[4628]: I1211 06:01:05.669274 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c82db411-744d-4cc8-8ae5-3031c70241d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c82db411-744d-4cc8-8ae5-3031c70241d4" (UID: "c82db411-744d-4cc8-8ae5-3031c70241d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 06:01:05 crc kubenswrapper[4628]: I1211 06:01:05.702154 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c82db411-744d-4cc8-8ae5-3031c70241d4-config-data" (OuterVolumeSpecName: "config-data") pod "c82db411-744d-4cc8-8ae5-3031c70241d4" (UID: "c82db411-744d-4cc8-8ae5-3031c70241d4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 06:01:05 crc kubenswrapper[4628]: I1211 06:01:05.734886 4628 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c82db411-744d-4cc8-8ae5-3031c70241d4-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 11 06:01:05 crc kubenswrapper[4628]: I1211 06:01:05.734918 4628 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82db411-744d-4cc8-8ae5-3031c70241d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 06:01:05 crc kubenswrapper[4628]: I1211 06:01:05.734932 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82db411-744d-4cc8-8ae5-3031c70241d4-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 06:01:05 crc kubenswrapper[4628]: I1211 06:01:05.735081 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc4rw\" (UniqueName: \"kubernetes.io/projected/c82db411-744d-4cc8-8ae5-3031c70241d4-kube-api-access-wc4rw\") on node \"crc\" DevicePath \"\"" Dec 11 06:01:06 crc kubenswrapper[4628]: I1211 06:01:06.252296 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29423881-sxmlb" event={"ID":"c82db411-744d-4cc8-8ae5-3031c70241d4","Type":"ContainerDied","Data":"4051f0ae345313d51b9c000dd9b7d3d32d23e7554a19cd692ecb7dda41acc5f1"} Dec 11 06:01:06 crc kubenswrapper[4628]: I1211 06:01:06.252342 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4051f0ae345313d51b9c000dd9b7d3d32d23e7554a19cd692ecb7dda41acc5f1" Dec 11 06:01:06 crc kubenswrapper[4628]: I1211 06:01:06.252401 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29423881-sxmlb" Dec 11 06:01:51 crc kubenswrapper[4628]: I1211 06:01:51.715103 4628 generic.go:334] "Generic (PLEG): container finished" podID="70e52eb8-3a47-4192-9d87-3178a99becfe" containerID="0995474904b5a1d378a24237e02b85c718e082742b040bf2d7bfef54d21966e1" exitCode=0 Dec 11 06:01:51 crc kubenswrapper[4628]: I1211 06:01:51.715183 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" event={"ID":"70e52eb8-3a47-4192-9d87-3178a99becfe","Type":"ContainerDied","Data":"0995474904b5a1d378a24237e02b85c718e082742b040bf2d7bfef54d21966e1"} Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.164461 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.272142 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ceilometer-compute-config-data-2\") pod \"70e52eb8-3a47-4192-9d87-3178a99becfe\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.272313 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ssh-key\") pod \"70e52eb8-3a47-4192-9d87-3178a99becfe\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.272343 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-inventory\") pod \"70e52eb8-3a47-4192-9d87-3178a99becfe\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.272415 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-telemetry-combined-ca-bundle\") pod \"70e52eb8-3a47-4192-9d87-3178a99becfe\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.272434 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ceilometer-compute-config-data-0\") pod \"70e52eb8-3a47-4192-9d87-3178a99becfe\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.272449 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ceilometer-compute-config-data-1\") pod \"70e52eb8-3a47-4192-9d87-3178a99becfe\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.272467 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t424\" (UniqueName: \"kubernetes.io/projected/70e52eb8-3a47-4192-9d87-3178a99becfe-kube-api-access-6t424\") pod \"70e52eb8-3a47-4192-9d87-3178a99becfe\" (UID: \"70e52eb8-3a47-4192-9d87-3178a99becfe\") " Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.278947 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70e52eb8-3a47-4192-9d87-3178a99becfe-kube-api-access-6t424" (OuterVolumeSpecName: "kube-api-access-6t424") pod "70e52eb8-3a47-4192-9d87-3178a99becfe" (UID: "70e52eb8-3a47-4192-9d87-3178a99becfe"). InnerVolumeSpecName "kube-api-access-6t424". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.291182 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "70e52eb8-3a47-4192-9d87-3178a99becfe" (UID: "70e52eb8-3a47-4192-9d87-3178a99becfe"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.301259 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "70e52eb8-3a47-4192-9d87-3178a99becfe" (UID: "70e52eb8-3a47-4192-9d87-3178a99becfe"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.303581 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-inventory" (OuterVolumeSpecName: "inventory") pod "70e52eb8-3a47-4192-9d87-3178a99becfe" (UID: "70e52eb8-3a47-4192-9d87-3178a99becfe"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.303945 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "70e52eb8-3a47-4192-9d87-3178a99becfe" (UID: "70e52eb8-3a47-4192-9d87-3178a99becfe"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.312354 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "70e52eb8-3a47-4192-9d87-3178a99becfe" (UID: "70e52eb8-3a47-4192-9d87-3178a99becfe"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.333053 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "70e52eb8-3a47-4192-9d87-3178a99becfe" (UID: "70e52eb8-3a47-4192-9d87-3178a99becfe"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.374823 4628 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.374877 4628 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.374897 4628 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.374913 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t424\" (UniqueName: \"kubernetes.io/projected/70e52eb8-3a47-4192-9d87-3178a99becfe-kube-api-access-6t424\") on node \"crc\" DevicePath \"\"" Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.374925 4628 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.374937 4628 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.374948 4628 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70e52eb8-3a47-4192-9d87-3178a99becfe-inventory\") on node \"crc\" DevicePath \"\"" Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.735331 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" event={"ID":"70e52eb8-3a47-4192-9d87-3178a99becfe","Type":"ContainerDied","Data":"c0f3ee33017d13b43616c1b0e41af5586b950b81773191ee78ad8cee9b98ce74"} Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.735608 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0f3ee33017d13b43616c1b0e41af5586b950b81773191ee78ad8cee9b98ce74" Dec 11 06:01:53 crc kubenswrapper[4628]: I1211 06:01:53.735416 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-f28db" Dec 11 06:02:31 crc kubenswrapper[4628]: I1211 06:02:31.426594 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:02:31 crc kubenswrapper[4628]: I1211 06:02:31.427183 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.800767 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 11 06:02:46 crc kubenswrapper[4628]: E1211 06:02:46.801735 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c82db411-744d-4cc8-8ae5-3031c70241d4" containerName="keystone-cron" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.801756 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82db411-744d-4cc8-8ae5-3031c70241d4" containerName="keystone-cron" Dec 11 06:02:46 crc kubenswrapper[4628]: E1211 06:02:46.801773 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e52eb8-3a47-4192-9d87-3178a99becfe" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.801782 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e52eb8-3a47-4192-9d87-3178a99becfe" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.802047 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="c82db411-744d-4cc8-8ae5-3031c70241d4" containerName="keystone-cron" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.802074 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e52eb8-3a47-4192-9d87-3178a99becfe" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.802873 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.807799 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.808247 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.815770 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.815828 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-qq5gj" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.848928 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.973342 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bdf75bdb-5535-4134-b9aa-f094e9e220fc-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.973411 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bdf75bdb-5535-4134-b9aa-f094e9e220fc-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.973513 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdf75bdb-5535-4134-b9aa-f094e9e220fc-config-data\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.974376 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.974437 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bdf75bdb-5535-4134-b9aa-f094e9e220fc-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.974532 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bdf75bdb-5535-4134-b9aa-f094e9e220fc-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.974580 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85r47\" (UniqueName: 
\"kubernetes.io/projected/bdf75bdb-5535-4134-b9aa-f094e9e220fc-kube-api-access-85r47\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.974625 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bdf75bdb-5535-4134-b9aa-f094e9e220fc-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:46 crc kubenswrapper[4628]: I1211 06:02:46.974789 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bdf75bdb-5535-4134-b9aa-f094e9e220fc-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.076874 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bdf75bdb-5535-4134-b9aa-f094e9e220fc-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.076932 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bdf75bdb-5535-4134-b9aa-f094e9e220fc-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.076965 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdf75bdb-5535-4134-b9aa-f094e9e220fc-config-data\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.076994 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.077027 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bdf75bdb-5535-4134-b9aa-f094e9e220fc-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.077078 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85r47\" (UniqueName: \"kubernetes.io/projected/bdf75bdb-5535-4134-b9aa-f094e9e220fc-kube-api-access-85r47\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.077099 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bdf75bdb-5535-4134-b9aa-f094e9e220fc-openstack-config\") pod 
\"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.077132 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bdf75bdb-5535-4134-b9aa-f094e9e220fc-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.077201 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bdf75bdb-5535-4134-b9aa-f094e9e220fc-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.077391 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bdf75bdb-5535-4134-b9aa-f094e9e220fc-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.078356 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bdf75bdb-5535-4134-b9aa-f094e9e220fc-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.078545 4628 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.080417 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdf75bdb-5535-4134-b9aa-f094e9e220fc-config-data\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.082722 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bdf75bdb-5535-4134-b9aa-f094e9e220fc-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.085180 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bdf75bdb-5535-4134-b9aa-f094e9e220fc-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.085310 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bdf75bdb-5535-4134-b9aa-f094e9e220fc-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 
06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.092650 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bdf75bdb-5535-4134-b9aa-f094e9e220fc-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.095588 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85r47\" (UniqueName: \"kubernetes.io/projected/bdf75bdb-5535-4134-b9aa-f094e9e220fc-kube-api-access-85r47\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.109913 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.154756 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 11 06:02:47 crc kubenswrapper[4628]: I1211 06:02:47.624354 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 11 06:02:48 crc kubenswrapper[4628]: I1211 06:02:48.246487 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"bdf75bdb-5535-4134-b9aa-f094e9e220fc","Type":"ContainerStarted","Data":"e6686896659ba93dfb262c5e6243e2cd19f94b82898e69da4bb44fd84472f67d"} Dec 11 06:03:00 crc kubenswrapper[4628]: I1211 06:03:00.109046 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kd74s"] Dec 11 06:03:00 crc kubenswrapper[4628]: I1211 06:03:00.114010 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kd74s" Dec 11 06:03:00 crc kubenswrapper[4628]: I1211 06:03:00.123985 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kd74s"] Dec 11 06:03:00 crc kubenswrapper[4628]: I1211 06:03:00.161924 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xszm\" (UniqueName: \"kubernetes.io/projected/177135b1-4b8a-4781-ae23-3f595ee3ab6f-kube-api-access-5xszm\") pod \"redhat-operators-kd74s\" (UID: \"177135b1-4b8a-4781-ae23-3f595ee3ab6f\") " pod="openshift-marketplace/redhat-operators-kd74s" Dec 11 06:03:00 crc kubenswrapper[4628]: I1211 06:03:00.164717 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/177135b1-4b8a-4781-ae23-3f595ee3ab6f-catalog-content\") pod \"redhat-operators-kd74s\" (UID: \"177135b1-4b8a-4781-ae23-3f595ee3ab6f\") " pod="openshift-marketplace/redhat-operators-kd74s" Dec 11 06:03:00 crc kubenswrapper[4628]: I1211 06:03:00.164804 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/177135b1-4b8a-4781-ae23-3f595ee3ab6f-utilities\") pod \"redhat-operators-kd74s\" (UID: \"177135b1-4b8a-4781-ae23-3f595ee3ab6f\") " pod="openshift-marketplace/redhat-operators-kd74s" Dec 11 06:03:00 crc kubenswrapper[4628]: I1211 06:03:00.266207 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/177135b1-4b8a-4781-ae23-3f595ee3ab6f-catalog-content\") pod \"redhat-operators-kd74s\" (UID: \"177135b1-4b8a-4781-ae23-3f595ee3ab6f\") " pod="openshift-marketplace/redhat-operators-kd74s" Dec 11 06:03:00 crc kubenswrapper[4628]: I1211 06:03:00.266765 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/177135b1-4b8a-4781-ae23-3f595ee3ab6f-utilities\") pod \"redhat-operators-kd74s\" (UID: \"177135b1-4b8a-4781-ae23-3f595ee3ab6f\") " pod="openshift-marketplace/redhat-operators-kd74s" Dec 11 06:03:00 crc kubenswrapper[4628]: I1211 06:03:00.267677 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xszm\" (UniqueName: \"kubernetes.io/projected/177135b1-4b8a-4781-ae23-3f595ee3ab6f-kube-api-access-5xszm\") pod \"redhat-operators-kd74s\" (UID: \"177135b1-4b8a-4781-ae23-3f595ee3ab6f\") " pod="openshift-marketplace/redhat-operators-kd74s" Dec 11 06:03:00 crc kubenswrapper[4628]: I1211 06:03:00.267386 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/177135b1-4b8a-4781-ae23-3f595ee3ab6f-utilities\") pod \"redhat-operators-kd74s\" (UID: \"177135b1-4b8a-4781-ae23-3f595ee3ab6f\") " pod="openshift-marketplace/redhat-operators-kd74s" Dec 11 06:03:00 crc kubenswrapper[4628]: I1211 06:03:00.267153 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/177135b1-4b8a-4781-ae23-3f595ee3ab6f-catalog-content\") pod \"redhat-operators-kd74s\" (UID: \"177135b1-4b8a-4781-ae23-3f595ee3ab6f\") " pod="openshift-marketplace/redhat-operators-kd74s" Dec 11 06:03:01 crc kubenswrapper[4628]: I1211 06:03:01.187182 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5xszm\" (UniqueName: \"kubernetes.io/projected/177135b1-4b8a-4781-ae23-3f595ee3ab6f-kube-api-access-5xszm\") pod \"redhat-operators-kd74s\" (UID: \"177135b1-4b8a-4781-ae23-3f595ee3ab6f\") " pod="openshift-marketplace/redhat-operators-kd74s" Dec 11 06:03:01 crc kubenswrapper[4628]: I1211 06:03:01.332255 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kd74s" Dec 11 06:03:01 crc kubenswrapper[4628]: I1211 06:03:01.426436 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:03:01 crc kubenswrapper[4628]: I1211 06:03:01.426488 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:03:25 crc kubenswrapper[4628]: E1211 06:03:25.782343 4628 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 11 06:03:25 crc kubenswrapper[4628]: E1211 06:03:25.783730 4628 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-85r47,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:
nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(bdf75bdb-5535-4134-b9aa-f094e9e220fc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 11 06:03:25 crc kubenswrapper[4628]: E1211 06:03:25.785538 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="bdf75bdb-5535-4134-b9aa-f094e9e220fc" Dec 11 06:03:26 crc kubenswrapper[4628]: I1211 06:03:26.239661 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kd74s"] Dec 11 06:03:26 crc kubenswrapper[4628]: I1211 06:03:26.657191 4628 generic.go:334] "Generic (PLEG): container finished" podID="177135b1-4b8a-4781-ae23-3f595ee3ab6f" containerID="2012188b1ca12ca0c8374bdfa3a94aaf3a7678a95abed03eac569d5ba99b7035" exitCode=0 Dec 11 06:03:26 crc kubenswrapper[4628]: I1211 06:03:26.657332 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kd74s" event={"ID":"177135b1-4b8a-4781-ae23-3f595ee3ab6f","Type":"ContainerDied","Data":"2012188b1ca12ca0c8374bdfa3a94aaf3a7678a95abed03eac569d5ba99b7035"} Dec 11 06:03:26 crc kubenswrapper[4628]: I1211 06:03:26.657539 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kd74s" event={"ID":"177135b1-4b8a-4781-ae23-3f595ee3ab6f","Type":"ContainerStarted","Data":"16455f72382f5ef8047018cccf2e60ca271e21097879271deac14d131b77cea2"} Dec 11 06:03:26 crc kubenswrapper[4628]: E1211 06:03:26.658913 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="bdf75bdb-5535-4134-b9aa-f094e9e220fc" Dec 11 06:03:26 crc kubenswrapper[4628]: I1211 06:03:26.660195 4628 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 06:03:28 crc kubenswrapper[4628]: I1211 06:03:28.676500 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kd74s" 
event={"ID":"177135b1-4b8a-4781-ae23-3f595ee3ab6f","Type":"ContainerStarted","Data":"66f3a93818e4350946911528e73b967e64bc5a45087a37b2750d65d57122f270"} Dec 11 06:03:31 crc kubenswrapper[4628]: I1211 06:03:31.426797 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:03:31 crc kubenswrapper[4628]: I1211 06:03:31.427123 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:03:31 crc kubenswrapper[4628]: I1211 06:03:31.427174 4628 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 06:03:31 crc kubenswrapper[4628]: I1211 06:03:31.427992 4628 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944"} pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 06:03:31 crc kubenswrapper[4628]: I1211 06:03:31.428053 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" containerID="cri-o://650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" gracePeriod=600 Dec 11 06:03:32 crc kubenswrapper[4628]: E1211 06:03:32.670373 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:03:32 crc kubenswrapper[4628]: I1211 06:03:32.704566 4628 generic.go:334] "Generic (PLEG): container finished" podID="177135b1-4b8a-4781-ae23-3f595ee3ab6f" containerID="66f3a93818e4350946911528e73b967e64bc5a45087a37b2750d65d57122f270" exitCode=0 Dec 11 06:03:32 crc kubenswrapper[4628]: I1211 06:03:32.704626 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kd74s" event={"ID":"177135b1-4b8a-4781-ae23-3f595ee3ab6f","Type":"ContainerDied","Data":"66f3a93818e4350946911528e73b967e64bc5a45087a37b2750d65d57122f270"} Dec 11 06:03:32 crc kubenswrapper[4628]: I1211 06:03:32.708861 4628 generic.go:334] "Generic (PLEG): container finished" podID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" exitCode=0 Dec 11 06:03:32 crc kubenswrapper[4628]: I1211 06:03:32.708901 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" 
event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerDied","Data":"650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944"} Dec 11 06:03:32 crc kubenswrapper[4628]: I1211 06:03:32.708933 4628 scope.go:117] "RemoveContainer" containerID="522036b9d0739c4f5dc7b8387fb0052860bb6519261fec1597dd6379d1a48f3b" Dec 11 06:03:32 crc kubenswrapper[4628]: I1211 06:03:32.709485 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:03:32 crc kubenswrapper[4628]: E1211 06:03:32.709689 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:03:33 crc kubenswrapper[4628]: I1211 06:03:33.720597 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kd74s" event={"ID":"177135b1-4b8a-4781-ae23-3f595ee3ab6f","Type":"ContainerStarted","Data":"ad689cc4db77797a36f12e52a837a16ba21dc3c3b1bd8fc423535f6b79f49b80"} Dec 11 06:03:33 crc kubenswrapper[4628]: I1211 06:03:33.747390 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kd74s" podStartSLOduration=26.953507288 podStartE2EDuration="33.747365729s" podCreationTimestamp="2025-12-11 06:03:00 +0000 UTC" firstStartedPulling="2025-12-11 06:03:26.659888589 +0000 UTC m=+2909.077235287" lastFinishedPulling="2025-12-11 06:03:33.45374703 +0000 UTC m=+2915.871093728" observedRunningTime="2025-12-11 06:03:33.737618035 +0000 UTC m=+2916.154964733" watchObservedRunningTime="2025-12-11 06:03:33.747365729 +0000 UTC m=+2916.164712467" Dec 11 06:03:41 crc kubenswrapper[4628]: I1211 06:03:41.333376 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kd74s" Dec 11 06:03:41 crc kubenswrapper[4628]: I1211 06:03:41.334788 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kd74s" Dec 11 06:03:42 crc kubenswrapper[4628]: I1211 06:03:42.404395 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kd74s" podUID="177135b1-4b8a-4781-ae23-3f595ee3ab6f" containerName="registry-server" probeResult="failure" output=< Dec 11 06:03:42 crc kubenswrapper[4628]: timeout: failed to connect service ":50051" within 1s Dec 11 06:03:42 crc kubenswrapper[4628]: > Dec 11 06:03:42 crc kubenswrapper[4628]: I1211 06:03:42.423355 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 11 06:03:43 crc kubenswrapper[4628]: I1211 06:03:43.830292 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"bdf75bdb-5535-4134-b9aa-f094e9e220fc","Type":"ContainerStarted","Data":"37d9c640ee63f39146a5c38040546dd6c6400b496a038d904e4c5b5ef0d04467"} Dec 11 06:03:43 crc kubenswrapper[4628]: I1211 06:03:43.866510 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.072667029 podStartE2EDuration="58.86648993s" podCreationTimestamp="2025-12-11 06:02:45 +0000 UTC" 
firstStartedPulling="2025-12-11 06:02:47.626328484 +0000 UTC m=+2870.043675182" lastFinishedPulling="2025-12-11 06:03:42.420151385 +0000 UTC m=+2924.837498083" observedRunningTime="2025-12-11 06:03:43.848030471 +0000 UTC m=+2926.265377189" watchObservedRunningTime="2025-12-11 06:03:43.86648993 +0000 UTC m=+2926.283836628" Dec 11 06:03:44 crc kubenswrapper[4628]: I1211 06:03:44.889785 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:03:44 crc kubenswrapper[4628]: E1211 06:03:44.890079 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:03:51 crc kubenswrapper[4628]: I1211 06:03:51.395489 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kd74s" Dec 11 06:03:51 crc kubenswrapper[4628]: I1211 06:03:51.467013 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kd74s" Dec 11 06:03:51 crc kubenswrapper[4628]: I1211 06:03:51.643439 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kd74s"] Dec 11 06:03:52 crc kubenswrapper[4628]: I1211 06:03:52.937928 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kd74s" podUID="177135b1-4b8a-4781-ae23-3f595ee3ab6f" containerName="registry-server" containerID="cri-o://ad689cc4db77797a36f12e52a837a16ba21dc3c3b1bd8fc423535f6b79f49b80" gracePeriod=2 Dec 11 06:03:53 crc kubenswrapper[4628]: I1211 06:03:53.440257 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kd74s" Dec 11 06:03:53 crc kubenswrapper[4628]: I1211 06:03:53.591684 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/177135b1-4b8a-4781-ae23-3f595ee3ab6f-utilities\") pod \"177135b1-4b8a-4781-ae23-3f595ee3ab6f\" (UID: \"177135b1-4b8a-4781-ae23-3f595ee3ab6f\") " Dec 11 06:03:53 crc kubenswrapper[4628]: I1211 06:03:53.592275 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/177135b1-4b8a-4781-ae23-3f595ee3ab6f-catalog-content\") pod \"177135b1-4b8a-4781-ae23-3f595ee3ab6f\" (UID: \"177135b1-4b8a-4781-ae23-3f595ee3ab6f\") " Dec 11 06:03:53 crc kubenswrapper[4628]: I1211 06:03:53.592312 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xszm\" (UniqueName: \"kubernetes.io/projected/177135b1-4b8a-4781-ae23-3f595ee3ab6f-kube-api-access-5xszm\") pod \"177135b1-4b8a-4781-ae23-3f595ee3ab6f\" (UID: \"177135b1-4b8a-4781-ae23-3f595ee3ab6f\") " Dec 11 06:03:53 crc kubenswrapper[4628]: I1211 06:03:53.594111 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/177135b1-4b8a-4781-ae23-3f595ee3ab6f-utilities" (OuterVolumeSpecName: "utilities") pod "177135b1-4b8a-4781-ae23-3f595ee3ab6f" (UID: "177135b1-4b8a-4781-ae23-3f595ee3ab6f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:03:53 crc kubenswrapper[4628]: I1211 06:03:53.609147 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/177135b1-4b8a-4781-ae23-3f595ee3ab6f-kube-api-access-5xszm" (OuterVolumeSpecName: "kube-api-access-5xszm") pod "177135b1-4b8a-4781-ae23-3f595ee3ab6f" (UID: "177135b1-4b8a-4781-ae23-3f595ee3ab6f"). InnerVolumeSpecName "kube-api-access-5xszm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:03:53 crc kubenswrapper[4628]: I1211 06:03:53.695145 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xszm\" (UniqueName: \"kubernetes.io/projected/177135b1-4b8a-4781-ae23-3f595ee3ab6f-kube-api-access-5xszm\") on node \"crc\" DevicePath \"\"" Dec 11 06:03:53 crc kubenswrapper[4628]: I1211 06:03:53.695186 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/177135b1-4b8a-4781-ae23-3f595ee3ab6f-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 06:03:53 crc kubenswrapper[4628]: I1211 06:03:53.735994 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/177135b1-4b8a-4781-ae23-3f595ee3ab6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "177135b1-4b8a-4781-ae23-3f595ee3ab6f" (UID: "177135b1-4b8a-4781-ae23-3f595ee3ab6f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:03:53 crc kubenswrapper[4628]: I1211 06:03:53.797149 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/177135b1-4b8a-4781-ae23-3f595ee3ab6f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 06:03:53 crc kubenswrapper[4628]: I1211 06:03:53.950909 4628 generic.go:334] "Generic (PLEG): container finished" podID="177135b1-4b8a-4781-ae23-3f595ee3ab6f" containerID="ad689cc4db77797a36f12e52a837a16ba21dc3c3b1bd8fc423535f6b79f49b80" exitCode=0 Dec 11 06:03:53 crc kubenswrapper[4628]: I1211 06:03:53.950959 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kd74s" event={"ID":"177135b1-4b8a-4781-ae23-3f595ee3ab6f","Type":"ContainerDied","Data":"ad689cc4db77797a36f12e52a837a16ba21dc3c3b1bd8fc423535f6b79f49b80"} Dec 11 06:03:53 crc kubenswrapper[4628]: I1211 06:03:53.950991 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kd74s" event={"ID":"177135b1-4b8a-4781-ae23-3f595ee3ab6f","Type":"ContainerDied","Data":"16455f72382f5ef8047018cccf2e60ca271e21097879271deac14d131b77cea2"} Dec 11 06:03:53 crc kubenswrapper[4628]: I1211 06:03:53.951013 4628 scope.go:117] "RemoveContainer" containerID="ad689cc4db77797a36f12e52a837a16ba21dc3c3b1bd8fc423535f6b79f49b80" Dec 11 06:03:53 crc kubenswrapper[4628]: I1211 06:03:53.951148 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kd74s" Dec 11 06:03:53 crc kubenswrapper[4628]: I1211 06:03:53.990637 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kd74s"] Dec 11 06:03:53 crc kubenswrapper[4628]: I1211 06:03:53.991062 4628 scope.go:117] "RemoveContainer" containerID="66f3a93818e4350946911528e73b967e64bc5a45087a37b2750d65d57122f270" Dec 11 06:03:53 crc kubenswrapper[4628]: I1211 06:03:53.999491 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kd74s"] Dec 11 06:03:54 crc kubenswrapper[4628]: I1211 06:03:54.016814 4628 scope.go:117] "RemoveContainer" containerID="2012188b1ca12ca0c8374bdfa3a94aaf3a7678a95abed03eac569d5ba99b7035" Dec 11 06:03:54 crc kubenswrapper[4628]: I1211 06:03:54.075057 4628 scope.go:117] "RemoveContainer" containerID="ad689cc4db77797a36f12e52a837a16ba21dc3c3b1bd8fc423535f6b79f49b80" Dec 11 06:03:54 crc kubenswrapper[4628]: E1211 06:03:54.075587 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad689cc4db77797a36f12e52a837a16ba21dc3c3b1bd8fc423535f6b79f49b80\": container with ID starting with ad689cc4db77797a36f12e52a837a16ba21dc3c3b1bd8fc423535f6b79f49b80 not found: ID does not exist" containerID="ad689cc4db77797a36f12e52a837a16ba21dc3c3b1bd8fc423535f6b79f49b80" Dec 11 06:03:54 crc kubenswrapper[4628]: I1211 06:03:54.075633 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad689cc4db77797a36f12e52a837a16ba21dc3c3b1bd8fc423535f6b79f49b80"} err="failed to get container status \"ad689cc4db77797a36f12e52a837a16ba21dc3c3b1bd8fc423535f6b79f49b80\": rpc error: code = NotFound desc = could not find container \"ad689cc4db77797a36f12e52a837a16ba21dc3c3b1bd8fc423535f6b79f49b80\": container with ID starting with ad689cc4db77797a36f12e52a837a16ba21dc3c3b1bd8fc423535f6b79f49b80 not found: ID does not exist" Dec 11 06:03:54 crc kubenswrapper[4628]: I1211 06:03:54.075653 4628 scope.go:117] "RemoveContainer" containerID="66f3a93818e4350946911528e73b967e64bc5a45087a37b2750d65d57122f270" Dec 11 06:03:54 crc kubenswrapper[4628]: E1211 06:03:54.076158 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66f3a93818e4350946911528e73b967e64bc5a45087a37b2750d65d57122f270\": container with ID starting with 66f3a93818e4350946911528e73b967e64bc5a45087a37b2750d65d57122f270 not found: ID does not exist" containerID="66f3a93818e4350946911528e73b967e64bc5a45087a37b2750d65d57122f270" Dec 11 06:03:54 crc kubenswrapper[4628]: I1211 06:03:54.076184 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66f3a93818e4350946911528e73b967e64bc5a45087a37b2750d65d57122f270"} err="failed to get container status \"66f3a93818e4350946911528e73b967e64bc5a45087a37b2750d65d57122f270\": rpc error: code = NotFound desc = could not find container \"66f3a93818e4350946911528e73b967e64bc5a45087a37b2750d65d57122f270\": container with ID starting with 66f3a93818e4350946911528e73b967e64bc5a45087a37b2750d65d57122f270 not found: ID does not exist" Dec 11 06:03:54 crc kubenswrapper[4628]: I1211 06:03:54.076198 4628 scope.go:117] "RemoveContainer" containerID="2012188b1ca12ca0c8374bdfa3a94aaf3a7678a95abed03eac569d5ba99b7035" Dec 11 06:03:54 crc kubenswrapper[4628]: E1211 06:03:54.076402 4628 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"2012188b1ca12ca0c8374bdfa3a94aaf3a7678a95abed03eac569d5ba99b7035\": container with ID starting with 2012188b1ca12ca0c8374bdfa3a94aaf3a7678a95abed03eac569d5ba99b7035 not found: ID does not exist" containerID="2012188b1ca12ca0c8374bdfa3a94aaf3a7678a95abed03eac569d5ba99b7035" Dec 11 06:03:54 crc kubenswrapper[4628]: I1211 06:03:54.076431 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2012188b1ca12ca0c8374bdfa3a94aaf3a7678a95abed03eac569d5ba99b7035"} err="failed to get container status \"2012188b1ca12ca0c8374bdfa3a94aaf3a7678a95abed03eac569d5ba99b7035\": rpc error: code = NotFound desc = could not find container \"2012188b1ca12ca0c8374bdfa3a94aaf3a7678a95abed03eac569d5ba99b7035\": container with ID starting with 2012188b1ca12ca0c8374bdfa3a94aaf3a7678a95abed03eac569d5ba99b7035 not found: ID does not exist" Dec 11 06:03:55 crc kubenswrapper[4628]: I1211 06:03:55.910827 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="177135b1-4b8a-4781-ae23-3f595ee3ab6f" path="/var/lib/kubelet/pods/177135b1-4b8a-4781-ae23-3f595ee3ab6f/volumes" Dec 11 06:03:59 crc kubenswrapper[4628]: I1211 06:03:59.891505 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:03:59 crc kubenswrapper[4628]: E1211 06:03:59.894936 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:04:14 crc kubenswrapper[4628]: I1211 06:04:14.889239 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:04:14 crc kubenswrapper[4628]: E1211 06:04:14.890053 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:04:27 crc kubenswrapper[4628]: I1211 06:04:27.895823 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:04:27 crc kubenswrapper[4628]: E1211 06:04:27.896449 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:04:38 crc kubenswrapper[4628]: I1211 06:04:38.889826 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:04:38 crc kubenswrapper[4628]: E1211 06:04:38.890902 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:04:51 crc kubenswrapper[4628]: I1211 06:04:51.890973 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:04:51 crc kubenswrapper[4628]: E1211 06:04:51.894033 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:05:06 crc kubenswrapper[4628]: I1211 06:05:06.889666 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:05:06 crc kubenswrapper[4628]: E1211 06:05:06.890475 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:05:19 crc kubenswrapper[4628]: I1211 06:05:19.890308 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:05:19 crc kubenswrapper[4628]: E1211 06:05:19.890970 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:05:32 crc kubenswrapper[4628]: I1211 06:05:32.891391 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:05:32 crc kubenswrapper[4628]: E1211 06:05:32.892571 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:05:36 crc kubenswrapper[4628]: I1211 06:05:36.045184 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zf9t5"] Dec 11 06:05:36 crc kubenswrapper[4628]: E1211 06:05:36.045932 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177135b1-4b8a-4781-ae23-3f595ee3ab6f" containerName="extract-utilities" Dec 11 06:05:36 crc kubenswrapper[4628]: I1211 06:05:36.045947 4628 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="177135b1-4b8a-4781-ae23-3f595ee3ab6f" containerName="extract-utilities" Dec 11 06:05:36 crc kubenswrapper[4628]: E1211 06:05:36.045960 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177135b1-4b8a-4781-ae23-3f595ee3ab6f" containerName="extract-content" Dec 11 06:05:36 crc kubenswrapper[4628]: I1211 06:05:36.045966 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="177135b1-4b8a-4781-ae23-3f595ee3ab6f" containerName="extract-content" Dec 11 06:05:36 crc kubenswrapper[4628]: E1211 06:05:36.045979 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177135b1-4b8a-4781-ae23-3f595ee3ab6f" containerName="registry-server" Dec 11 06:05:36 crc kubenswrapper[4628]: I1211 06:05:36.045985 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="177135b1-4b8a-4781-ae23-3f595ee3ab6f" containerName="registry-server" Dec 11 06:05:36 crc kubenswrapper[4628]: I1211 06:05:36.046161 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="177135b1-4b8a-4781-ae23-3f595ee3ab6f" containerName="registry-server" Dec 11 06:05:36 crc kubenswrapper[4628]: I1211 06:05:36.048058 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zf9t5" Dec 11 06:05:36 crc kubenswrapper[4628]: I1211 06:05:36.062982 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zf9t5"] Dec 11 06:05:36 crc kubenswrapper[4628]: I1211 06:05:36.148153 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhjpc\" (UniqueName: \"kubernetes.io/projected/2466dfeb-42c9-4a50-9a80-502565709587-kube-api-access-jhjpc\") pod \"certified-operators-zf9t5\" (UID: \"2466dfeb-42c9-4a50-9a80-502565709587\") " pod="openshift-marketplace/certified-operators-zf9t5" Dec 11 06:05:36 crc kubenswrapper[4628]: I1211 06:05:36.148337 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2466dfeb-42c9-4a50-9a80-502565709587-catalog-content\") pod \"certified-operators-zf9t5\" (UID: \"2466dfeb-42c9-4a50-9a80-502565709587\") " pod="openshift-marketplace/certified-operators-zf9t5" Dec 11 06:05:36 crc kubenswrapper[4628]: I1211 06:05:36.148675 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2466dfeb-42c9-4a50-9a80-502565709587-utilities\") pod \"certified-operators-zf9t5\" (UID: \"2466dfeb-42c9-4a50-9a80-502565709587\") " pod="openshift-marketplace/certified-operators-zf9t5" Dec 11 06:05:36 crc kubenswrapper[4628]: I1211 06:05:36.251320 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2466dfeb-42c9-4a50-9a80-502565709587-catalog-content\") pod \"certified-operators-zf9t5\" (UID: \"2466dfeb-42c9-4a50-9a80-502565709587\") " pod="openshift-marketplace/certified-operators-zf9t5" Dec 11 06:05:36 crc kubenswrapper[4628]: I1211 06:05:36.251463 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2466dfeb-42c9-4a50-9a80-502565709587-utilities\") pod \"certified-operators-zf9t5\" (UID: \"2466dfeb-42c9-4a50-9a80-502565709587\") " pod="openshift-marketplace/certified-operators-zf9t5" Dec 11 06:05:36 crc kubenswrapper[4628]: I1211 06:05:36.251510 4628 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhjpc\" (UniqueName: \"kubernetes.io/projected/2466dfeb-42c9-4a50-9a80-502565709587-kube-api-access-jhjpc\") pod \"certified-operators-zf9t5\" (UID: \"2466dfeb-42c9-4a50-9a80-502565709587\") " pod="openshift-marketplace/certified-operators-zf9t5" Dec 11 06:05:36 crc kubenswrapper[4628]: I1211 06:05:36.252081 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2466dfeb-42c9-4a50-9a80-502565709587-catalog-content\") pod \"certified-operators-zf9t5\" (UID: \"2466dfeb-42c9-4a50-9a80-502565709587\") " pod="openshift-marketplace/certified-operators-zf9t5" Dec 11 06:05:36 crc kubenswrapper[4628]: I1211 06:05:36.252259 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2466dfeb-42c9-4a50-9a80-502565709587-utilities\") pod \"certified-operators-zf9t5\" (UID: \"2466dfeb-42c9-4a50-9a80-502565709587\") " pod="openshift-marketplace/certified-operators-zf9t5" Dec 11 06:05:36 crc kubenswrapper[4628]: I1211 06:05:36.271906 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhjpc\" (UniqueName: \"kubernetes.io/projected/2466dfeb-42c9-4a50-9a80-502565709587-kube-api-access-jhjpc\") pod \"certified-operators-zf9t5\" (UID: \"2466dfeb-42c9-4a50-9a80-502565709587\") " pod="openshift-marketplace/certified-operators-zf9t5" Dec 11 06:05:36 crc kubenswrapper[4628]: I1211 06:05:36.442276 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zf9t5" Dec 11 06:05:37 crc kubenswrapper[4628]: I1211 06:05:37.264401 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zf9t5"] Dec 11 06:05:37 crc kubenswrapper[4628]: I1211 06:05:37.963431 4628 generic.go:334] "Generic (PLEG): container finished" podID="2466dfeb-42c9-4a50-9a80-502565709587" containerID="edf22d08f97d1d6b8191199361d972678f3febfb616e424436293b5e84f91b5e" exitCode=0 Dec 11 06:05:37 crc kubenswrapper[4628]: I1211 06:05:37.963636 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zf9t5" event={"ID":"2466dfeb-42c9-4a50-9a80-502565709587","Type":"ContainerDied","Data":"edf22d08f97d1d6b8191199361d972678f3febfb616e424436293b5e84f91b5e"} Dec 11 06:05:37 crc kubenswrapper[4628]: I1211 06:05:37.963749 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zf9t5" event={"ID":"2466dfeb-42c9-4a50-9a80-502565709587","Type":"ContainerStarted","Data":"2cd2780dc2b9bffe155950d227b5c69e615ad795091487c015058eab38ee7de6"} Dec 11 06:05:45 crc kubenswrapper[4628]: I1211 06:05:45.890154 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:05:45 crc kubenswrapper[4628]: E1211 06:05:45.891312 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:05:48 crc kubenswrapper[4628]: I1211 06:05:48.632424 4628 patch_prober.go:28] interesting 
pod/authentication-operator-69f744f599-czhht container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 11 06:05:48 crc kubenswrapper[4628]: I1211 06:05:48.632768 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-czhht" podUID="a88efb22-2511-4037-9257-102b56de5226" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 11 06:05:48 crc kubenswrapper[4628]: I1211 06:05:48.794826 4628 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="0ba5be80-485c-4b8b-8e1d-3326db7cc5a0" containerName="ovn-northd" probeResult="failure" output="command timed out" Dec 11 06:05:48 crc kubenswrapper[4628]: I1211 06:05:48.794930 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="0ba5be80-485c-4b8b-8e1d-3326db7cc5a0" containerName="ovn-northd" probeResult="failure" output="command timed out" Dec 11 06:05:49 crc kubenswrapper[4628]: I1211 06:05:49.061937 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zf9t5" event={"ID":"2466dfeb-42c9-4a50-9a80-502565709587","Type":"ContainerStarted","Data":"fbea465050247fb0fc3af68c69c263005ab286560b44d17b204323580f787fa9"} Dec 11 06:05:51 crc kubenswrapper[4628]: I1211 06:05:51.084635 4628 generic.go:334] "Generic (PLEG): container finished" podID="2466dfeb-42c9-4a50-9a80-502565709587" containerID="fbea465050247fb0fc3af68c69c263005ab286560b44d17b204323580f787fa9" exitCode=0 Dec 11 06:05:51 crc kubenswrapper[4628]: I1211 06:05:51.084717 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zf9t5" event={"ID":"2466dfeb-42c9-4a50-9a80-502565709587","Type":"ContainerDied","Data":"fbea465050247fb0fc3af68c69c263005ab286560b44d17b204323580f787fa9"} Dec 11 06:05:52 crc kubenswrapper[4628]: I1211 06:05:52.099594 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zf9t5" event={"ID":"2466dfeb-42c9-4a50-9a80-502565709587","Type":"ContainerStarted","Data":"e92cc3a526637b2cddb5e9d9f7076fa10d89ab32cfb573df293e775f12468ea1"} Dec 11 06:05:52 crc kubenswrapper[4628]: I1211 06:05:52.127063 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zf9t5" podStartSLOduration=2.53047585 podStartE2EDuration="16.127045596s" podCreationTimestamp="2025-12-11 06:05:36 +0000 UTC" firstStartedPulling="2025-12-11 06:05:37.965559578 +0000 UTC m=+3040.382906276" lastFinishedPulling="2025-12-11 06:05:51.562129324 +0000 UTC m=+3053.979476022" observedRunningTime="2025-12-11 06:05:52.120346064 +0000 UTC m=+3054.537692762" watchObservedRunningTime="2025-12-11 06:05:52.127045596 +0000 UTC m=+3054.544392294" Dec 11 06:05:56 crc kubenswrapper[4628]: I1211 06:05:56.443108 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zf9t5" Dec 11 06:05:56 crc kubenswrapper[4628]: I1211 06:05:56.443712 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zf9t5" Dec 11 06:05:56 crc kubenswrapper[4628]: 
I1211 06:05:56.486799 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zf9t5" Dec 11 06:05:57 crc kubenswrapper[4628]: I1211 06:05:57.195636 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zf9t5" Dec 11 06:05:57 crc kubenswrapper[4628]: I1211 06:05:57.276601 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zf9t5"] Dec 11 06:05:57 crc kubenswrapper[4628]: I1211 06:05:57.336890 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qszrs"] Dec 11 06:05:57 crc kubenswrapper[4628]: I1211 06:05:57.337142 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qszrs" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" containerName="registry-server" containerID="cri-o://1662dde031ea99d90bae900963bcc4006659b1f78379200d09f20b9cb03b9a58" gracePeriod=2 Dec 11 06:05:57 crc kubenswrapper[4628]: I1211 06:05:57.916652 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:05:57 crc kubenswrapper[4628]: E1211 06:05:57.917040 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:05:58 crc kubenswrapper[4628]: I1211 06:05:58.148906 4628 generic.go:334] "Generic (PLEG): container finished" podID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" containerID="1662dde031ea99d90bae900963bcc4006659b1f78379200d09f20b9cb03b9a58" exitCode=0 Dec 11 06:05:58 crc kubenswrapper[4628]: I1211 06:05:58.149830 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qszrs" event={"ID":"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc","Type":"ContainerDied","Data":"1662dde031ea99d90bae900963bcc4006659b1f78379200d09f20b9cb03b9a58"} Dec 11 06:05:58 crc kubenswrapper[4628]: I1211 06:05:58.149881 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qszrs" event={"ID":"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc","Type":"ContainerDied","Data":"9ca5d856bc9d0ec2bb635083ed90445f1f06d379b104c7ca141937530a3a5be4"} Dec 11 06:05:58 crc kubenswrapper[4628]: I1211 06:05:58.149892 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ca5d856bc9d0ec2bb635083ed90445f1f06d379b104c7ca141937530a3a5be4" Dec 11 06:05:58 crc kubenswrapper[4628]: I1211 06:05:58.158387 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qszrs" Dec 11 06:05:58 crc kubenswrapper[4628]: I1211 06:05:58.244813 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjcmv\" (UniqueName: \"kubernetes.io/projected/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc-kube-api-access-mjcmv\") pod \"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc\" (UID: \"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc\") " Dec 11 06:05:58 crc kubenswrapper[4628]: I1211 06:05:58.244994 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc-utilities\") pod \"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc\" (UID: \"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc\") " Dec 11 06:05:58 crc kubenswrapper[4628]: I1211 06:05:58.245035 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc-catalog-content\") pod \"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc\" (UID: \"596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc\") " Dec 11 06:05:58 crc kubenswrapper[4628]: I1211 06:05:58.246374 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc-utilities" (OuterVolumeSpecName: "utilities") pod "596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" (UID: "596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:05:58 crc kubenswrapper[4628]: I1211 06:05:58.258417 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc-kube-api-access-mjcmv" (OuterVolumeSpecName: "kube-api-access-mjcmv") pod "596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" (UID: "596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc"). InnerVolumeSpecName "kube-api-access-mjcmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:05:58 crc kubenswrapper[4628]: I1211 06:05:58.321648 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" (UID: "596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:05:58 crc kubenswrapper[4628]: I1211 06:05:58.346834 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 06:05:58 crc kubenswrapper[4628]: I1211 06:05:58.346886 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 06:05:58 crc kubenswrapper[4628]: I1211 06:05:58.346898 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjcmv\" (UniqueName: \"kubernetes.io/projected/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc-kube-api-access-mjcmv\") on node \"crc\" DevicePath \"\"" Dec 11 06:05:59 crc kubenswrapper[4628]: I1211 06:05:59.157003 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qszrs" Dec 11 06:05:59 crc kubenswrapper[4628]: I1211 06:05:59.190171 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qszrs"] Dec 11 06:05:59 crc kubenswrapper[4628]: I1211 06:05:59.198366 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qszrs"] Dec 11 06:05:59 crc kubenswrapper[4628]: I1211 06:05:59.900522 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" path="/var/lib/kubelet/pods/596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc/volumes" Dec 11 06:06:10 crc kubenswrapper[4628]: I1211 06:06:10.889717 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:06:10 crc kubenswrapper[4628]: E1211 06:06:10.890552 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:06:23 crc kubenswrapper[4628]: I1211 06:06:23.890089 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:06:23 crc kubenswrapper[4628]: E1211 06:06:23.890911 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:06:28 crc kubenswrapper[4628]: I1211 06:06:28.555381 4628 scope.go:117] "RemoveContainer" containerID="7dbec7998d2d42bfaebd1295042a1dd4837b2bf2c0653331e24d6b18480698f6" Dec 11 06:06:28 crc kubenswrapper[4628]: I1211 06:06:28.582092 4628 scope.go:117] "RemoveContainer" containerID="04282ed96b10bb7a59ba3deb58b3349bd7a5907b52535d88c99c73a3f7791e8c" Dec 11 06:06:28 crc kubenswrapper[4628]: I1211 06:06:28.629464 4628 scope.go:117] "RemoveContainer" containerID="1662dde031ea99d90bae900963bcc4006659b1f78379200d09f20b9cb03b9a58" Dec 11 06:06:35 crc kubenswrapper[4628]: I1211 06:06:35.889973 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:06:35 crc kubenswrapper[4628]: E1211 06:06:35.890991 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:06:50 crc kubenswrapper[4628]: I1211 06:06:50.890888 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:06:50 crc kubenswrapper[4628]: E1211 06:06:50.891713 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:07:01 crc kubenswrapper[4628]: I1211 06:07:01.889829 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:07:01 crc kubenswrapper[4628]: E1211 06:07:01.890797 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:07:14 crc kubenswrapper[4628]: I1211 06:07:14.890020 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:07:14 crc kubenswrapper[4628]: E1211 06:07:14.890820 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:07:26 crc kubenswrapper[4628]: I1211 06:07:26.890774 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:07:26 crc kubenswrapper[4628]: E1211 06:07:26.891662 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:07:41 crc kubenswrapper[4628]: I1211 06:07:41.889594 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:07:41 crc kubenswrapper[4628]: E1211 06:07:41.890401 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:07:52 crc kubenswrapper[4628]: I1211 06:07:52.890117 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:07:52 crc kubenswrapper[4628]: E1211 06:07:52.890935 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:08:07 crc kubenswrapper[4628]: I1211 06:08:07.896713 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:08:07 crc kubenswrapper[4628]: E1211 06:08:07.897297 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:08:18 crc kubenswrapper[4628]: I1211 06:08:18.889392 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:08:18 crc kubenswrapper[4628]: E1211 06:08:18.890183 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:08:33 crc kubenswrapper[4628]: I1211 06:08:33.889620 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:08:34 crc kubenswrapper[4628]: I1211 06:08:34.594087 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"a9d0674039c93a524e570e8ed6bd4cdf2d9c5d8d2cae45af50102adfdc0ac1b0"} Dec 11 06:11:01 crc kubenswrapper[4628]: I1211 06:11:01.426977 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:11:01 crc kubenswrapper[4628]: I1211 06:11:01.427979 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:11:30 crc kubenswrapper[4628]: I1211 06:11:30.513105 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-slrfw"] Dec 11 06:11:30 crc kubenswrapper[4628]: E1211 06:11:30.514077 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" containerName="extract-utilities" Dec 11 06:11:30 crc kubenswrapper[4628]: I1211 06:11:30.514091 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" containerName="extract-utilities" Dec 11 06:11:30 crc kubenswrapper[4628]: E1211 06:11:30.514117 4628 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" containerName="extract-content" Dec 11 06:11:30 crc kubenswrapper[4628]: I1211 06:11:30.514125 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" containerName="extract-content" Dec 11 06:11:30 crc kubenswrapper[4628]: E1211 06:11:30.514141 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" containerName="registry-server" Dec 11 06:11:30 crc kubenswrapper[4628]: I1211 06:11:30.514150 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" containerName="registry-server" Dec 11 06:11:30 crc kubenswrapper[4628]: I1211 06:11:30.514413 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="596ff0c0-3caf-4bb3-b49a-ff3d4ed25adc" containerName="registry-server" Dec 11 06:11:30 crc kubenswrapper[4628]: I1211 06:11:30.516106 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-slrfw" Dec 11 06:11:30 crc kubenswrapper[4628]: I1211 06:11:30.566006 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-slrfw"] Dec 11 06:11:30 crc kubenswrapper[4628]: I1211 06:11:30.567460 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4b69\" (UniqueName: \"kubernetes.io/projected/e99689f2-e449-4a63-aee4-2c22e629616a-kube-api-access-b4b69\") pod \"community-operators-slrfw\" (UID: \"e99689f2-e449-4a63-aee4-2c22e629616a\") " pod="openshift-marketplace/community-operators-slrfw" Dec 11 06:11:30 crc kubenswrapper[4628]: I1211 06:11:30.567497 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e99689f2-e449-4a63-aee4-2c22e629616a-catalog-content\") pod \"community-operators-slrfw\" (UID: \"e99689f2-e449-4a63-aee4-2c22e629616a\") " pod="openshift-marketplace/community-operators-slrfw" Dec 11 06:11:30 crc kubenswrapper[4628]: I1211 06:11:30.567566 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e99689f2-e449-4a63-aee4-2c22e629616a-utilities\") pod \"community-operators-slrfw\" (UID: \"e99689f2-e449-4a63-aee4-2c22e629616a\") " pod="openshift-marketplace/community-operators-slrfw" Dec 11 06:11:30 crc kubenswrapper[4628]: I1211 06:11:30.669137 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4b69\" (UniqueName: \"kubernetes.io/projected/e99689f2-e449-4a63-aee4-2c22e629616a-kube-api-access-b4b69\") pod \"community-operators-slrfw\" (UID: \"e99689f2-e449-4a63-aee4-2c22e629616a\") " pod="openshift-marketplace/community-operators-slrfw" Dec 11 06:11:30 crc kubenswrapper[4628]: I1211 06:11:30.669553 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e99689f2-e449-4a63-aee4-2c22e629616a-catalog-content\") pod \"community-operators-slrfw\" (UID: \"e99689f2-e449-4a63-aee4-2c22e629616a\") " pod="openshift-marketplace/community-operators-slrfw" Dec 11 06:11:30 crc kubenswrapper[4628]: I1211 06:11:30.669673 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e99689f2-e449-4a63-aee4-2c22e629616a-utilities\") pod 
\"community-operators-slrfw\" (UID: \"e99689f2-e449-4a63-aee4-2c22e629616a\") " pod="openshift-marketplace/community-operators-slrfw" Dec 11 06:11:30 crc kubenswrapper[4628]: I1211 06:11:30.670227 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e99689f2-e449-4a63-aee4-2c22e629616a-utilities\") pod \"community-operators-slrfw\" (UID: \"e99689f2-e449-4a63-aee4-2c22e629616a\") " pod="openshift-marketplace/community-operators-slrfw" Dec 11 06:11:30 crc kubenswrapper[4628]: I1211 06:11:30.670523 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e99689f2-e449-4a63-aee4-2c22e629616a-catalog-content\") pod \"community-operators-slrfw\" (UID: \"e99689f2-e449-4a63-aee4-2c22e629616a\") " pod="openshift-marketplace/community-operators-slrfw" Dec 11 06:11:30 crc kubenswrapper[4628]: I1211 06:11:30.698522 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4b69\" (UniqueName: \"kubernetes.io/projected/e99689f2-e449-4a63-aee4-2c22e629616a-kube-api-access-b4b69\") pod \"community-operators-slrfw\" (UID: \"e99689f2-e449-4a63-aee4-2c22e629616a\") " pod="openshift-marketplace/community-operators-slrfw" Dec 11 06:11:30 crc kubenswrapper[4628]: I1211 06:11:30.852264 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-slrfw" Dec 11 06:11:31 crc kubenswrapper[4628]: I1211 06:11:31.427080 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:11:31 crc kubenswrapper[4628]: I1211 06:11:31.427449 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:11:31 crc kubenswrapper[4628]: I1211 06:11:31.490639 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-slrfw"] Dec 11 06:11:31 crc kubenswrapper[4628]: I1211 06:11:31.585386 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slrfw" event={"ID":"e99689f2-e449-4a63-aee4-2c22e629616a","Type":"ContainerStarted","Data":"da8840038400900eb1453ebc013cf9232706e3561c75121b8fde53a65cc82e31"} Dec 11 06:11:32 crc kubenswrapper[4628]: I1211 06:11:32.607518 4628 generic.go:334] "Generic (PLEG): container finished" podID="e99689f2-e449-4a63-aee4-2c22e629616a" containerID="41b5a1d79066ec82064898c7f7ccb1cf9c6f57f6c258ba982177059758a03c73" exitCode=0 Dec 11 06:11:32 crc kubenswrapper[4628]: I1211 06:11:32.607795 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slrfw" event={"ID":"e99689f2-e449-4a63-aee4-2c22e629616a","Type":"ContainerDied","Data":"41b5a1d79066ec82064898c7f7ccb1cf9c6f57f6c258ba982177059758a03c73"} Dec 11 06:11:32 crc kubenswrapper[4628]: I1211 06:11:32.610072 4628 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 06:11:32 crc kubenswrapper[4628]: I1211 06:11:32.697767 4628 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x27dk"] Dec 11 06:11:32 crc kubenswrapper[4628]: I1211 06:11:32.704544 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x27dk" Dec 11 06:11:32 crc kubenswrapper[4628]: I1211 06:11:32.712597 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x27dk"] Dec 11 06:11:32 crc kubenswrapper[4628]: I1211 06:11:32.819346 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkdmq\" (UniqueName: \"kubernetes.io/projected/f62e33d9-faae-40d6-83cf-c2f288dc6032-kube-api-access-nkdmq\") pod \"redhat-marketplace-x27dk\" (UID: \"f62e33d9-faae-40d6-83cf-c2f288dc6032\") " pod="openshift-marketplace/redhat-marketplace-x27dk" Dec 11 06:11:32 crc kubenswrapper[4628]: I1211 06:11:32.819532 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f62e33d9-faae-40d6-83cf-c2f288dc6032-catalog-content\") pod \"redhat-marketplace-x27dk\" (UID: \"f62e33d9-faae-40d6-83cf-c2f288dc6032\") " pod="openshift-marketplace/redhat-marketplace-x27dk" Dec 11 06:11:32 crc kubenswrapper[4628]: I1211 06:11:32.819566 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f62e33d9-faae-40d6-83cf-c2f288dc6032-utilities\") pod \"redhat-marketplace-x27dk\" (UID: \"f62e33d9-faae-40d6-83cf-c2f288dc6032\") " pod="openshift-marketplace/redhat-marketplace-x27dk" Dec 11 06:11:32 crc kubenswrapper[4628]: I1211 06:11:32.921813 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkdmq\" (UniqueName: \"kubernetes.io/projected/f62e33d9-faae-40d6-83cf-c2f288dc6032-kube-api-access-nkdmq\") pod \"redhat-marketplace-x27dk\" (UID: \"f62e33d9-faae-40d6-83cf-c2f288dc6032\") " pod="openshift-marketplace/redhat-marketplace-x27dk" Dec 11 06:11:32 crc kubenswrapper[4628]: I1211 06:11:32.922010 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f62e33d9-faae-40d6-83cf-c2f288dc6032-catalog-content\") pod \"redhat-marketplace-x27dk\" (UID: \"f62e33d9-faae-40d6-83cf-c2f288dc6032\") " pod="openshift-marketplace/redhat-marketplace-x27dk" Dec 11 06:11:32 crc kubenswrapper[4628]: I1211 06:11:32.922044 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f62e33d9-faae-40d6-83cf-c2f288dc6032-utilities\") pod \"redhat-marketplace-x27dk\" (UID: \"f62e33d9-faae-40d6-83cf-c2f288dc6032\") " pod="openshift-marketplace/redhat-marketplace-x27dk" Dec 11 06:11:32 crc kubenswrapper[4628]: I1211 06:11:32.922539 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f62e33d9-faae-40d6-83cf-c2f288dc6032-utilities\") pod \"redhat-marketplace-x27dk\" (UID: \"f62e33d9-faae-40d6-83cf-c2f288dc6032\") " pod="openshift-marketplace/redhat-marketplace-x27dk" Dec 11 06:11:32 crc kubenswrapper[4628]: I1211 06:11:32.922560 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f62e33d9-faae-40d6-83cf-c2f288dc6032-catalog-content\") pod \"redhat-marketplace-x27dk\" 
(UID: \"f62e33d9-faae-40d6-83cf-c2f288dc6032\") " pod="openshift-marketplace/redhat-marketplace-x27dk" Dec 11 06:11:32 crc kubenswrapper[4628]: I1211 06:11:32.946224 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkdmq\" (UniqueName: \"kubernetes.io/projected/f62e33d9-faae-40d6-83cf-c2f288dc6032-kube-api-access-nkdmq\") pod \"redhat-marketplace-x27dk\" (UID: \"f62e33d9-faae-40d6-83cf-c2f288dc6032\") " pod="openshift-marketplace/redhat-marketplace-x27dk" Dec 11 06:11:33 crc kubenswrapper[4628]: I1211 06:11:33.027262 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x27dk" Dec 11 06:11:33 crc kubenswrapper[4628]: I1211 06:11:33.538488 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x27dk"] Dec 11 06:11:33 crc kubenswrapper[4628]: I1211 06:11:33.617904 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x27dk" event={"ID":"f62e33d9-faae-40d6-83cf-c2f288dc6032","Type":"ContainerStarted","Data":"2cb8e5c873549277f5ece5cabdc1ef3b8ae69623c27033638bb0b9daaf9bc04a"} Dec 11 06:11:34 crc kubenswrapper[4628]: I1211 06:11:34.627731 4628 generic.go:334] "Generic (PLEG): container finished" podID="f62e33d9-faae-40d6-83cf-c2f288dc6032" containerID="58fd6e9e43c9d0de8299b2794ba2d1e2223a3042b0241455cd887aef495c44eb" exitCode=0 Dec 11 06:11:34 crc kubenswrapper[4628]: I1211 06:11:34.628030 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x27dk" event={"ID":"f62e33d9-faae-40d6-83cf-c2f288dc6032","Type":"ContainerDied","Data":"58fd6e9e43c9d0de8299b2794ba2d1e2223a3042b0241455cd887aef495c44eb"} Dec 11 06:11:35 crc kubenswrapper[4628]: I1211 06:11:35.637384 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x27dk" event={"ID":"f62e33d9-faae-40d6-83cf-c2f288dc6032","Type":"ContainerStarted","Data":"3162405cc30f61ae491793c22dca53b4fea740b22a5322d6c3bb24ef79983f19"} Dec 11 06:11:36 crc kubenswrapper[4628]: I1211 06:11:36.648041 4628 generic.go:334] "Generic (PLEG): container finished" podID="f62e33d9-faae-40d6-83cf-c2f288dc6032" containerID="3162405cc30f61ae491793c22dca53b4fea740b22a5322d6c3bb24ef79983f19" exitCode=0 Dec 11 06:11:36 crc kubenswrapper[4628]: I1211 06:11:36.648218 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x27dk" event={"ID":"f62e33d9-faae-40d6-83cf-c2f288dc6032","Type":"ContainerDied","Data":"3162405cc30f61ae491793c22dca53b4fea740b22a5322d6c3bb24ef79983f19"} Dec 11 06:11:40 crc kubenswrapper[4628]: I1211 06:11:40.688936 4628 generic.go:334] "Generic (PLEG): container finished" podID="e99689f2-e449-4a63-aee4-2c22e629616a" containerID="117b70b98da2d692a764f8243163e459d0ccd165d1d6f1c44b270a557672ab02" exitCode=0 Dec 11 06:11:40 crc kubenswrapper[4628]: I1211 06:11:40.689072 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slrfw" event={"ID":"e99689f2-e449-4a63-aee4-2c22e629616a","Type":"ContainerDied","Data":"117b70b98da2d692a764f8243163e459d0ccd165d1d6f1c44b270a557672ab02"} Dec 11 06:11:40 crc kubenswrapper[4628]: I1211 06:11:40.695891 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x27dk" 
event={"ID":"f62e33d9-faae-40d6-83cf-c2f288dc6032","Type":"ContainerStarted","Data":"6a69385af91f925ba65b18879f5b244ba949836512b9e39af26e96ed8bb244cb"} Dec 11 06:11:40 crc kubenswrapper[4628]: I1211 06:11:40.739490 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x27dk" podStartSLOduration=3.277765095 podStartE2EDuration="8.739470867s" podCreationTimestamp="2025-12-11 06:11:32 +0000 UTC" firstStartedPulling="2025-12-11 06:11:34.630816281 +0000 UTC m=+3397.048162979" lastFinishedPulling="2025-12-11 06:11:40.092522043 +0000 UTC m=+3402.509868751" observedRunningTime="2025-12-11 06:11:40.732078457 +0000 UTC m=+3403.149425155" watchObservedRunningTime="2025-12-11 06:11:40.739470867 +0000 UTC m=+3403.156817565" Dec 11 06:11:41 crc kubenswrapper[4628]: I1211 06:11:41.706930 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-slrfw" event={"ID":"e99689f2-e449-4a63-aee4-2c22e629616a","Type":"ContainerStarted","Data":"7abd6f115018674a5c25a98da73a1f2cb89d6b84684122df0f4ec5a056a2afac"} Dec 11 06:11:41 crc kubenswrapper[4628]: I1211 06:11:41.730316 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-slrfw" podStartSLOduration=2.96227389 podStartE2EDuration="11.730297654s" podCreationTimestamp="2025-12-11 06:11:30 +0000 UTC" firstStartedPulling="2025-12-11 06:11:32.609789023 +0000 UTC m=+3395.027135721" lastFinishedPulling="2025-12-11 06:11:41.377812777 +0000 UTC m=+3403.795159485" observedRunningTime="2025-12-11 06:11:41.73017981 +0000 UTC m=+3404.147526498" watchObservedRunningTime="2025-12-11 06:11:41.730297654 +0000 UTC m=+3404.147644362" Dec 11 06:11:43 crc kubenswrapper[4628]: I1211 06:11:43.028287 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x27dk" Dec 11 06:11:43 crc kubenswrapper[4628]: I1211 06:11:43.028826 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x27dk" Dec 11 06:11:43 crc kubenswrapper[4628]: I1211 06:11:43.085818 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x27dk" Dec 11 06:11:50 crc kubenswrapper[4628]: I1211 06:11:50.853110 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-slrfw" Dec 11 06:11:50 crc kubenswrapper[4628]: I1211 06:11:50.853609 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-slrfw" Dec 11 06:11:50 crc kubenswrapper[4628]: I1211 06:11:50.914344 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-slrfw" Dec 11 06:11:51 crc kubenswrapper[4628]: I1211 06:11:51.868179 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-slrfw" Dec 11 06:11:51 crc kubenswrapper[4628]: I1211 06:11:51.955370 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-slrfw"] Dec 11 06:11:52 crc kubenswrapper[4628]: I1211 06:11:52.006928 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zj2jl"] Dec 11 06:11:52 crc kubenswrapper[4628]: I1211 06:11:52.007224 4628 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/community-operators-zj2jl" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" containerName="registry-server" containerID="cri-o://01be7683d4796464a297fc0b13b34e4a1cb1918de28170172c832c6e963f1446" gracePeriod=2 Dec 11 06:11:52 crc kubenswrapper[4628]: I1211 06:11:52.799304 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zj2jl" Dec 11 06:11:52 crc kubenswrapper[4628]: I1211 06:11:52.815466 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbnrh\" (UniqueName: \"kubernetes.io/projected/1fd0b2ff-4fee-40e6-b0d7-408c706be731-kube-api-access-bbnrh\") pod \"1fd0b2ff-4fee-40e6-b0d7-408c706be731\" (UID: \"1fd0b2ff-4fee-40e6-b0d7-408c706be731\") " Dec 11 06:11:52 crc kubenswrapper[4628]: I1211 06:11:52.815578 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd0b2ff-4fee-40e6-b0d7-408c706be731-catalog-content\") pod \"1fd0b2ff-4fee-40e6-b0d7-408c706be731\" (UID: \"1fd0b2ff-4fee-40e6-b0d7-408c706be731\") " Dec 11 06:11:52 crc kubenswrapper[4628]: I1211 06:11:52.815621 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd0b2ff-4fee-40e6-b0d7-408c706be731-utilities\") pod \"1fd0b2ff-4fee-40e6-b0d7-408c706be731\" (UID: \"1fd0b2ff-4fee-40e6-b0d7-408c706be731\") " Dec 11 06:11:52 crc kubenswrapper[4628]: I1211 06:11:52.818437 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fd0b2ff-4fee-40e6-b0d7-408c706be731-utilities" (OuterVolumeSpecName: "utilities") pod "1fd0b2ff-4fee-40e6-b0d7-408c706be731" (UID: "1fd0b2ff-4fee-40e6-b0d7-408c706be731"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:11:52 crc kubenswrapper[4628]: I1211 06:11:52.835495 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fd0b2ff-4fee-40e6-b0d7-408c706be731-kube-api-access-bbnrh" (OuterVolumeSpecName: "kube-api-access-bbnrh") pod "1fd0b2ff-4fee-40e6-b0d7-408c706be731" (UID: "1fd0b2ff-4fee-40e6-b0d7-408c706be731"). InnerVolumeSpecName "kube-api-access-bbnrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:11:52 crc kubenswrapper[4628]: I1211 06:11:52.899125 4628 generic.go:334] "Generic (PLEG): container finished" podID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" containerID="01be7683d4796464a297fc0b13b34e4a1cb1918de28170172c832c6e963f1446" exitCode=0 Dec 11 06:11:52 crc kubenswrapper[4628]: I1211 06:11:52.901342 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zj2jl" Dec 11 06:11:52 crc kubenswrapper[4628]: I1211 06:11:52.902109 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zj2jl" event={"ID":"1fd0b2ff-4fee-40e6-b0d7-408c706be731","Type":"ContainerDied","Data":"01be7683d4796464a297fc0b13b34e4a1cb1918de28170172c832c6e963f1446"} Dec 11 06:11:52 crc kubenswrapper[4628]: I1211 06:11:52.902234 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zj2jl" event={"ID":"1fd0b2ff-4fee-40e6-b0d7-408c706be731","Type":"ContainerDied","Data":"e36ff131183baaf6b37ddea4b497f76b59aa24730cd2ad35f6a4ba0790e90594"} Dec 11 06:11:52 crc kubenswrapper[4628]: I1211 06:11:52.902344 4628 scope.go:117] "RemoveContainer" containerID="01be7683d4796464a297fc0b13b34e4a1cb1918de28170172c832c6e963f1446" Dec 11 06:11:52 crc kubenswrapper[4628]: I1211 06:11:52.918971 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbnrh\" (UniqueName: \"kubernetes.io/projected/1fd0b2ff-4fee-40e6-b0d7-408c706be731-kube-api-access-bbnrh\") on node \"crc\" DevicePath \"\"" Dec 11 06:11:52 crc kubenswrapper[4628]: I1211 06:11:52.923086 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd0b2ff-4fee-40e6-b0d7-408c706be731-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 06:11:52 crc kubenswrapper[4628]: I1211 06:11:52.955178 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fd0b2ff-4fee-40e6-b0d7-408c706be731-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1fd0b2ff-4fee-40e6-b0d7-408c706be731" (UID: "1fd0b2ff-4fee-40e6-b0d7-408c706be731"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:11:52 crc kubenswrapper[4628]: I1211 06:11:52.987952 4628 scope.go:117] "RemoveContainer" containerID="d23c426965c0b40df54a03ff376fd87ece8f1df2b1fbbec25b6efa8201250c79" Dec 11 06:11:53 crc kubenswrapper[4628]: I1211 06:11:53.022124 4628 scope.go:117] "RemoveContainer" containerID="80cb6292459b5acb75f4c42a25864d6fc33e898ee203772a368a2c28b9921124" Dec 11 06:11:53 crc kubenswrapper[4628]: I1211 06:11:53.026013 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd0b2ff-4fee-40e6-b0d7-408c706be731-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 06:11:53 crc kubenswrapper[4628]: I1211 06:11:53.095090 4628 scope.go:117] "RemoveContainer" containerID="01be7683d4796464a297fc0b13b34e4a1cb1918de28170172c832c6e963f1446" Dec 11 06:11:53 crc kubenswrapper[4628]: E1211 06:11:53.097679 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01be7683d4796464a297fc0b13b34e4a1cb1918de28170172c832c6e963f1446\": container with ID starting with 01be7683d4796464a297fc0b13b34e4a1cb1918de28170172c832c6e963f1446 not found: ID does not exist" containerID="01be7683d4796464a297fc0b13b34e4a1cb1918de28170172c832c6e963f1446" Dec 11 06:11:53 crc kubenswrapper[4628]: I1211 06:11:53.097727 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01be7683d4796464a297fc0b13b34e4a1cb1918de28170172c832c6e963f1446"} err="failed to get container status \"01be7683d4796464a297fc0b13b34e4a1cb1918de28170172c832c6e963f1446\": rpc error: code = NotFound desc = could not find container \"01be7683d4796464a297fc0b13b34e4a1cb1918de28170172c832c6e963f1446\": container with ID starting with 01be7683d4796464a297fc0b13b34e4a1cb1918de28170172c832c6e963f1446 not found: ID does not exist" Dec 11 06:11:53 crc kubenswrapper[4628]: I1211 06:11:53.097754 4628 scope.go:117] "RemoveContainer" containerID="d23c426965c0b40df54a03ff376fd87ece8f1df2b1fbbec25b6efa8201250c79" Dec 11 06:11:53 crc kubenswrapper[4628]: E1211 06:11:53.099254 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d23c426965c0b40df54a03ff376fd87ece8f1df2b1fbbec25b6efa8201250c79\": container with ID starting with d23c426965c0b40df54a03ff376fd87ece8f1df2b1fbbec25b6efa8201250c79 not found: ID does not exist" containerID="d23c426965c0b40df54a03ff376fd87ece8f1df2b1fbbec25b6efa8201250c79" Dec 11 06:11:53 crc kubenswrapper[4628]: I1211 06:11:53.099283 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d23c426965c0b40df54a03ff376fd87ece8f1df2b1fbbec25b6efa8201250c79"} err="failed to get container status \"d23c426965c0b40df54a03ff376fd87ece8f1df2b1fbbec25b6efa8201250c79\": rpc error: code = NotFound desc = could not find container \"d23c426965c0b40df54a03ff376fd87ece8f1df2b1fbbec25b6efa8201250c79\": container with ID starting with d23c426965c0b40df54a03ff376fd87ece8f1df2b1fbbec25b6efa8201250c79 not found: ID does not exist" Dec 11 06:11:53 crc kubenswrapper[4628]: I1211 06:11:53.099300 4628 scope.go:117] "RemoveContainer" containerID="80cb6292459b5acb75f4c42a25864d6fc33e898ee203772a368a2c28b9921124" Dec 11 06:11:53 crc kubenswrapper[4628]: E1211 06:11:53.099596 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"80cb6292459b5acb75f4c42a25864d6fc33e898ee203772a368a2c28b9921124\": container with ID starting with 80cb6292459b5acb75f4c42a25864d6fc33e898ee203772a368a2c28b9921124 not found: ID does not exist" containerID="80cb6292459b5acb75f4c42a25864d6fc33e898ee203772a368a2c28b9921124" Dec 11 06:11:53 crc kubenswrapper[4628]: I1211 06:11:53.099631 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80cb6292459b5acb75f4c42a25864d6fc33e898ee203772a368a2c28b9921124"} err="failed to get container status \"80cb6292459b5acb75f4c42a25864d6fc33e898ee203772a368a2c28b9921124\": rpc error: code = NotFound desc = could not find container \"80cb6292459b5acb75f4c42a25864d6fc33e898ee203772a368a2c28b9921124\": container with ID starting with 80cb6292459b5acb75f4c42a25864d6fc33e898ee203772a368a2c28b9921124 not found: ID does not exist" Dec 11 06:11:53 crc kubenswrapper[4628]: I1211 06:11:53.115014 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x27dk" Dec 11 06:11:53 crc kubenswrapper[4628]: I1211 06:11:53.237904 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zj2jl"] Dec 11 06:11:53 crc kubenswrapper[4628]: I1211 06:11:53.245190 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zj2jl"] Dec 11 06:11:53 crc kubenswrapper[4628]: I1211 06:11:53.904983 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" path="/var/lib/kubelet/pods/1fd0b2ff-4fee-40e6-b0d7-408c706be731/volumes" Dec 11 06:11:55 crc kubenswrapper[4628]: I1211 06:11:55.368676 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x27dk"] Dec 11 06:11:55 crc kubenswrapper[4628]: I1211 06:11:55.369194 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x27dk" podUID="f62e33d9-faae-40d6-83cf-c2f288dc6032" containerName="registry-server" containerID="cri-o://6a69385af91f925ba65b18879f5b244ba949836512b9e39af26e96ed8bb244cb" gracePeriod=2 Dec 11 06:11:55 crc kubenswrapper[4628]: I1211 06:11:55.934288 4628 generic.go:334] "Generic (PLEG): container finished" podID="f62e33d9-faae-40d6-83cf-c2f288dc6032" containerID="6a69385af91f925ba65b18879f5b244ba949836512b9e39af26e96ed8bb244cb" exitCode=0 Dec 11 06:11:55 crc kubenswrapper[4628]: I1211 06:11:55.934617 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x27dk" event={"ID":"f62e33d9-faae-40d6-83cf-c2f288dc6032","Type":"ContainerDied","Data":"6a69385af91f925ba65b18879f5b244ba949836512b9e39af26e96ed8bb244cb"} Dec 11 06:11:56 crc kubenswrapper[4628]: I1211 06:11:56.076109 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x27dk" Dec 11 06:11:56 crc kubenswrapper[4628]: I1211 06:11:56.204117 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkdmq\" (UniqueName: \"kubernetes.io/projected/f62e33d9-faae-40d6-83cf-c2f288dc6032-kube-api-access-nkdmq\") pod \"f62e33d9-faae-40d6-83cf-c2f288dc6032\" (UID: \"f62e33d9-faae-40d6-83cf-c2f288dc6032\") " Dec 11 06:11:56 crc kubenswrapper[4628]: I1211 06:11:56.204213 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f62e33d9-faae-40d6-83cf-c2f288dc6032-catalog-content\") pod \"f62e33d9-faae-40d6-83cf-c2f288dc6032\" (UID: \"f62e33d9-faae-40d6-83cf-c2f288dc6032\") " Dec 11 06:11:56 crc kubenswrapper[4628]: I1211 06:11:56.204355 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f62e33d9-faae-40d6-83cf-c2f288dc6032-utilities\") pod \"f62e33d9-faae-40d6-83cf-c2f288dc6032\" (UID: \"f62e33d9-faae-40d6-83cf-c2f288dc6032\") " Dec 11 06:11:56 crc kubenswrapper[4628]: I1211 06:11:56.205244 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f62e33d9-faae-40d6-83cf-c2f288dc6032-utilities" (OuterVolumeSpecName: "utilities") pod "f62e33d9-faae-40d6-83cf-c2f288dc6032" (UID: "f62e33d9-faae-40d6-83cf-c2f288dc6032"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:11:56 crc kubenswrapper[4628]: I1211 06:11:56.219188 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f62e33d9-faae-40d6-83cf-c2f288dc6032-kube-api-access-nkdmq" (OuterVolumeSpecName: "kube-api-access-nkdmq") pod "f62e33d9-faae-40d6-83cf-c2f288dc6032" (UID: "f62e33d9-faae-40d6-83cf-c2f288dc6032"). InnerVolumeSpecName "kube-api-access-nkdmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:11:56 crc kubenswrapper[4628]: I1211 06:11:56.222606 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f62e33d9-faae-40d6-83cf-c2f288dc6032-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f62e33d9-faae-40d6-83cf-c2f288dc6032" (UID: "f62e33d9-faae-40d6-83cf-c2f288dc6032"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:11:56 crc kubenswrapper[4628]: I1211 06:11:56.306512 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f62e33d9-faae-40d6-83cf-c2f288dc6032-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 06:11:56 crc kubenswrapper[4628]: I1211 06:11:56.306545 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkdmq\" (UniqueName: \"kubernetes.io/projected/f62e33d9-faae-40d6-83cf-c2f288dc6032-kube-api-access-nkdmq\") on node \"crc\" DevicePath \"\"" Dec 11 06:11:56 crc kubenswrapper[4628]: I1211 06:11:56.306557 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f62e33d9-faae-40d6-83cf-c2f288dc6032-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 06:11:56 crc kubenswrapper[4628]: I1211 06:11:56.948366 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x27dk" event={"ID":"f62e33d9-faae-40d6-83cf-c2f288dc6032","Type":"ContainerDied","Data":"2cb8e5c873549277f5ece5cabdc1ef3b8ae69623c27033638bb0b9daaf9bc04a"} Dec 11 06:11:56 crc kubenswrapper[4628]: I1211 06:11:56.949835 4628 scope.go:117] "RemoveContainer" containerID="6a69385af91f925ba65b18879f5b244ba949836512b9e39af26e96ed8bb244cb" Dec 11 06:11:56 crc kubenswrapper[4628]: I1211 06:11:56.948467 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x27dk" Dec 11 06:11:56 crc kubenswrapper[4628]: I1211 06:11:56.984908 4628 scope.go:117] "RemoveContainer" containerID="3162405cc30f61ae491793c22dca53b4fea740b22a5322d6c3bb24ef79983f19" Dec 11 06:11:56 crc kubenswrapper[4628]: I1211 06:11:56.986370 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x27dk"] Dec 11 06:11:56 crc kubenswrapper[4628]: I1211 06:11:56.999500 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x27dk"] Dec 11 06:11:57 crc kubenswrapper[4628]: I1211 06:11:57.011750 4628 scope.go:117] "RemoveContainer" containerID="58fd6e9e43c9d0de8299b2794ba2d1e2223a3042b0241455cd887aef495c44eb" Dec 11 06:11:57 crc kubenswrapper[4628]: I1211 06:11:57.924541 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f62e33d9-faae-40d6-83cf-c2f288dc6032" path="/var/lib/kubelet/pods/f62e33d9-faae-40d6-83cf-c2f288dc6032/volumes" Dec 11 06:12:01 crc kubenswrapper[4628]: I1211 06:12:01.426551 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:12:01 crc kubenswrapper[4628]: I1211 06:12:01.427170 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:12:01 crc kubenswrapper[4628]: I1211 06:12:01.427223 4628 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 06:12:01 crc kubenswrapper[4628]: I1211 06:12:01.427999 4628 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a9d0674039c93a524e570e8ed6bd4cdf2d9c5d8d2cae45af50102adfdc0ac1b0"} pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 06:12:01 crc kubenswrapper[4628]: I1211 06:12:01.428047 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" containerID="cri-o://a9d0674039c93a524e570e8ed6bd4cdf2d9c5d8d2cae45af50102adfdc0ac1b0" gracePeriod=600 Dec 11 06:12:01 crc kubenswrapper[4628]: I1211 06:12:01.997623 4628 generic.go:334] "Generic (PLEG): container finished" podID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerID="a9d0674039c93a524e570e8ed6bd4cdf2d9c5d8d2cae45af50102adfdc0ac1b0" exitCode=0 Dec 11 06:12:01 crc kubenswrapper[4628]: I1211 06:12:01.997819 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerDied","Data":"a9d0674039c93a524e570e8ed6bd4cdf2d9c5d8d2cae45af50102adfdc0ac1b0"} Dec 11 06:12:01 crc kubenswrapper[4628]: I1211 06:12:01.998254 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521"} Dec 11 06:12:01 crc kubenswrapper[4628]: I1211 06:12:01.998294 4628 scope.go:117] "RemoveContainer" containerID="650c4b8dde862d00a5ee0555d4cc7031c3e27b99e5651208479d536cbb917944" Dec 11 06:14:01 crc kubenswrapper[4628]: I1211 06:14:01.426612 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:14:01 crc kubenswrapper[4628]: I1211 06:14:01.427222 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.555683 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qk2jm"] Dec 11 06:14:26 crc kubenswrapper[4628]: E1211 06:14:26.556628 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" containerName="registry-server" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.556644 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" containerName="registry-server" Dec 11 06:14:26 crc kubenswrapper[4628]: E1211 06:14:26.556661 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" containerName="extract-content" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.556666 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" 
containerName="extract-content" Dec 11 06:14:26 crc kubenswrapper[4628]: E1211 06:14:26.556677 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62e33d9-faae-40d6-83cf-c2f288dc6032" containerName="extract-content" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.556683 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62e33d9-faae-40d6-83cf-c2f288dc6032" containerName="extract-content" Dec 11 06:14:26 crc kubenswrapper[4628]: E1211 06:14:26.556694 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" containerName="extract-utilities" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.556700 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" containerName="extract-utilities" Dec 11 06:14:26 crc kubenswrapper[4628]: E1211 06:14:26.556713 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62e33d9-faae-40d6-83cf-c2f288dc6032" containerName="registry-server" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.556719 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62e33d9-faae-40d6-83cf-c2f288dc6032" containerName="registry-server" Dec 11 06:14:26 crc kubenswrapper[4628]: E1211 06:14:26.556731 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62e33d9-faae-40d6-83cf-c2f288dc6032" containerName="extract-utilities" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.556737 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62e33d9-faae-40d6-83cf-c2f288dc6032" containerName="extract-utilities" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.556929 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="f62e33d9-faae-40d6-83cf-c2f288dc6032" containerName="registry-server" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.556943 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fd0b2ff-4fee-40e6-b0d7-408c706be731" containerName="registry-server" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.558331 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qk2jm" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.587064 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qk2jm"] Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.643150 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d39934-66d0-4edc-97e0-58863702adf9-catalog-content\") pod \"redhat-operators-qk2jm\" (UID: \"55d39934-66d0-4edc-97e0-58863702adf9\") " pod="openshift-marketplace/redhat-operators-qk2jm" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.643226 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d39934-66d0-4edc-97e0-58863702adf9-utilities\") pod \"redhat-operators-qk2jm\" (UID: \"55d39934-66d0-4edc-97e0-58863702adf9\") " pod="openshift-marketplace/redhat-operators-qk2jm" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.643251 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm9qf\" (UniqueName: \"kubernetes.io/projected/55d39934-66d0-4edc-97e0-58863702adf9-kube-api-access-nm9qf\") pod \"redhat-operators-qk2jm\" (UID: \"55d39934-66d0-4edc-97e0-58863702adf9\") " pod="openshift-marketplace/redhat-operators-qk2jm" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.745132 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d39934-66d0-4edc-97e0-58863702adf9-catalog-content\") pod \"redhat-operators-qk2jm\" (UID: \"55d39934-66d0-4edc-97e0-58863702adf9\") " pod="openshift-marketplace/redhat-operators-qk2jm" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.745212 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d39934-66d0-4edc-97e0-58863702adf9-utilities\") pod \"redhat-operators-qk2jm\" (UID: \"55d39934-66d0-4edc-97e0-58863702adf9\") " pod="openshift-marketplace/redhat-operators-qk2jm" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.745251 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm9qf\" (UniqueName: \"kubernetes.io/projected/55d39934-66d0-4edc-97e0-58863702adf9-kube-api-access-nm9qf\") pod \"redhat-operators-qk2jm\" (UID: \"55d39934-66d0-4edc-97e0-58863702adf9\") " pod="openshift-marketplace/redhat-operators-qk2jm" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.746106 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d39934-66d0-4edc-97e0-58863702adf9-catalog-content\") pod \"redhat-operators-qk2jm\" (UID: \"55d39934-66d0-4edc-97e0-58863702adf9\") " pod="openshift-marketplace/redhat-operators-qk2jm" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.746330 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d39934-66d0-4edc-97e0-58863702adf9-utilities\") pod \"redhat-operators-qk2jm\" (UID: \"55d39934-66d0-4edc-97e0-58863702adf9\") " pod="openshift-marketplace/redhat-operators-qk2jm" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.783056 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nm9qf\" (UniqueName: \"kubernetes.io/projected/55d39934-66d0-4edc-97e0-58863702adf9-kube-api-access-nm9qf\") pod \"redhat-operators-qk2jm\" (UID: \"55d39934-66d0-4edc-97e0-58863702adf9\") " pod="openshift-marketplace/redhat-operators-qk2jm" Dec 11 06:14:26 crc kubenswrapper[4628]: I1211 06:14:26.883945 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qk2jm" Dec 11 06:14:27 crc kubenswrapper[4628]: I1211 06:14:27.408797 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qk2jm"] Dec 11 06:14:28 crc kubenswrapper[4628]: I1211 06:14:28.280643 4628 generic.go:334] "Generic (PLEG): container finished" podID="55d39934-66d0-4edc-97e0-58863702adf9" containerID="e268a28a0a3c5487c0dc0cac3023de9fa895f3940503635d38a45c2a36438595" exitCode=0 Dec 11 06:14:28 crc kubenswrapper[4628]: I1211 06:14:28.281198 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qk2jm" event={"ID":"55d39934-66d0-4edc-97e0-58863702adf9","Type":"ContainerDied","Data":"e268a28a0a3c5487c0dc0cac3023de9fa895f3940503635d38a45c2a36438595"} Dec 11 06:14:28 crc kubenswrapper[4628]: I1211 06:14:28.281233 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qk2jm" event={"ID":"55d39934-66d0-4edc-97e0-58863702adf9","Type":"ContainerStarted","Data":"e198a03537fff353107abde5735409ff7421c88b8be807565ab9845ff61328a8"} Dec 11 06:14:30 crc kubenswrapper[4628]: I1211 06:14:30.300885 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qk2jm" event={"ID":"55d39934-66d0-4edc-97e0-58863702adf9","Type":"ContainerStarted","Data":"b378916816065106db5c72757f6e7729d767cae4dfcd42523647d493283de3a3"} Dec 11 06:14:31 crc kubenswrapper[4628]: I1211 06:14:31.428597 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:14:31 crc kubenswrapper[4628]: I1211 06:14:31.428940 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:14:33 crc kubenswrapper[4628]: I1211 06:14:33.328415 4628 generic.go:334] "Generic (PLEG): container finished" podID="55d39934-66d0-4edc-97e0-58863702adf9" containerID="b378916816065106db5c72757f6e7729d767cae4dfcd42523647d493283de3a3" exitCode=0 Dec 11 06:14:33 crc kubenswrapper[4628]: I1211 06:14:33.328508 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qk2jm" event={"ID":"55d39934-66d0-4edc-97e0-58863702adf9","Type":"ContainerDied","Data":"b378916816065106db5c72757f6e7729d767cae4dfcd42523647d493283de3a3"} Dec 11 06:14:34 crc kubenswrapper[4628]: I1211 06:14:34.340315 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qk2jm" event={"ID":"55d39934-66d0-4edc-97e0-58863702adf9","Type":"ContainerStarted","Data":"5e65aa99e7e8dad5c4691bcae4d8162f0e628102bf15cd042d3a7e433882289f"} Dec 11 06:14:34 crc kubenswrapper[4628]: I1211 06:14:34.361746 4628 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qk2jm" podStartSLOduration=2.8425658240000002 podStartE2EDuration="8.361724439s" podCreationTimestamp="2025-12-11 06:14:26 +0000 UTC" firstStartedPulling="2025-12-11 06:14:28.282348914 +0000 UTC m=+3570.699695612" lastFinishedPulling="2025-12-11 06:14:33.801507519 +0000 UTC m=+3576.218854227" observedRunningTime="2025-12-11 06:14:34.359038816 +0000 UTC m=+3576.776385534" watchObservedRunningTime="2025-12-11 06:14:34.361724439 +0000 UTC m=+3576.779071157" Dec 11 06:14:36 crc kubenswrapper[4628]: I1211 06:14:36.884833 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qk2jm" Dec 11 06:14:36 crc kubenswrapper[4628]: I1211 06:14:36.885176 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qk2jm" Dec 11 06:14:37 crc kubenswrapper[4628]: I1211 06:14:37.973493 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qk2jm" podUID="55d39934-66d0-4edc-97e0-58863702adf9" containerName="registry-server" probeResult="failure" output=< Dec 11 06:14:37 crc kubenswrapper[4628]: timeout: failed to connect service ":50051" within 1s Dec 11 06:14:37 crc kubenswrapper[4628]: > Dec 11 06:14:46 crc kubenswrapper[4628]: I1211 06:14:46.942440 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qk2jm" Dec 11 06:14:47 crc kubenswrapper[4628]: I1211 06:14:47.014548 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qk2jm" Dec 11 06:14:47 crc kubenswrapper[4628]: I1211 06:14:47.219908 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qk2jm"] Dec 11 06:14:48 crc kubenswrapper[4628]: I1211 06:14:48.455246 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qk2jm" podUID="55d39934-66d0-4edc-97e0-58863702adf9" containerName="registry-server" containerID="cri-o://5e65aa99e7e8dad5c4691bcae4d8162f0e628102bf15cd042d3a7e433882289f" gracePeriod=2 Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.112594 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qk2jm" Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.224918 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d39934-66d0-4edc-97e0-58863702adf9-utilities\") pod \"55d39934-66d0-4edc-97e0-58863702adf9\" (UID: \"55d39934-66d0-4edc-97e0-58863702adf9\") " Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.225183 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm9qf\" (UniqueName: \"kubernetes.io/projected/55d39934-66d0-4edc-97e0-58863702adf9-kube-api-access-nm9qf\") pod \"55d39934-66d0-4edc-97e0-58863702adf9\" (UID: \"55d39934-66d0-4edc-97e0-58863702adf9\") " Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.225220 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d39934-66d0-4edc-97e0-58863702adf9-catalog-content\") pod \"55d39934-66d0-4edc-97e0-58863702adf9\" (UID: \"55d39934-66d0-4edc-97e0-58863702adf9\") " Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.226029 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55d39934-66d0-4edc-97e0-58863702adf9-utilities" (OuterVolumeSpecName: "utilities") pod "55d39934-66d0-4edc-97e0-58863702adf9" (UID: "55d39934-66d0-4edc-97e0-58863702adf9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.233868 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55d39934-66d0-4edc-97e0-58863702adf9-kube-api-access-nm9qf" (OuterVolumeSpecName: "kube-api-access-nm9qf") pod "55d39934-66d0-4edc-97e0-58863702adf9" (UID: "55d39934-66d0-4edc-97e0-58863702adf9"). InnerVolumeSpecName "kube-api-access-nm9qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.327532 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm9qf\" (UniqueName: \"kubernetes.io/projected/55d39934-66d0-4edc-97e0-58863702adf9-kube-api-access-nm9qf\") on node \"crc\" DevicePath \"\"" Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.327789 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d39934-66d0-4edc-97e0-58863702adf9-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.339370 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55d39934-66d0-4edc-97e0-58863702adf9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55d39934-66d0-4edc-97e0-58863702adf9" (UID: "55d39934-66d0-4edc-97e0-58863702adf9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.429913 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d39934-66d0-4edc-97e0-58863702adf9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.467084 4628 generic.go:334] "Generic (PLEG): container finished" podID="55d39934-66d0-4edc-97e0-58863702adf9" containerID="5e65aa99e7e8dad5c4691bcae4d8162f0e628102bf15cd042d3a7e433882289f" exitCode=0 Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.467229 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qk2jm" event={"ID":"55d39934-66d0-4edc-97e0-58863702adf9","Type":"ContainerDied","Data":"5e65aa99e7e8dad5c4691bcae4d8162f0e628102bf15cd042d3a7e433882289f"} Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.468015 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qk2jm" event={"ID":"55d39934-66d0-4edc-97e0-58863702adf9","Type":"ContainerDied","Data":"e198a03537fff353107abde5735409ff7421c88b8be807565ab9845ff61328a8"} Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.468048 4628 scope.go:117] "RemoveContainer" containerID="5e65aa99e7e8dad5c4691bcae4d8162f0e628102bf15cd042d3a7e433882289f" Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.467323 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qk2jm" Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.497916 4628 scope.go:117] "RemoveContainer" containerID="b378916816065106db5c72757f6e7729d767cae4dfcd42523647d493283de3a3" Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.503178 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qk2jm"] Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.519486 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qk2jm"] Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.522193 4628 scope.go:117] "RemoveContainer" containerID="e268a28a0a3c5487c0dc0cac3023de9fa895f3940503635d38a45c2a36438595" Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.567709 4628 scope.go:117] "RemoveContainer" containerID="5e65aa99e7e8dad5c4691bcae4d8162f0e628102bf15cd042d3a7e433882289f" Dec 11 06:14:49 crc kubenswrapper[4628]: E1211 06:14:49.568230 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e65aa99e7e8dad5c4691bcae4d8162f0e628102bf15cd042d3a7e433882289f\": container with ID starting with 5e65aa99e7e8dad5c4691bcae4d8162f0e628102bf15cd042d3a7e433882289f not found: ID does not exist" containerID="5e65aa99e7e8dad5c4691bcae4d8162f0e628102bf15cd042d3a7e433882289f" Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.568261 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e65aa99e7e8dad5c4691bcae4d8162f0e628102bf15cd042d3a7e433882289f"} err="failed to get container status \"5e65aa99e7e8dad5c4691bcae4d8162f0e628102bf15cd042d3a7e433882289f\": rpc error: code = NotFound desc = could not find container \"5e65aa99e7e8dad5c4691bcae4d8162f0e628102bf15cd042d3a7e433882289f\": container with ID starting with 5e65aa99e7e8dad5c4691bcae4d8162f0e628102bf15cd042d3a7e433882289f not found: ID does not exist" Dec 11 06:14:49 crc 
kubenswrapper[4628]: I1211 06:14:49.568282 4628 scope.go:117] "RemoveContainer" containerID="b378916816065106db5c72757f6e7729d767cae4dfcd42523647d493283de3a3" Dec 11 06:14:49 crc kubenswrapper[4628]: E1211 06:14:49.568660 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b378916816065106db5c72757f6e7729d767cae4dfcd42523647d493283de3a3\": container with ID starting with b378916816065106db5c72757f6e7729d767cae4dfcd42523647d493283de3a3 not found: ID does not exist" containerID="b378916816065106db5c72757f6e7729d767cae4dfcd42523647d493283de3a3" Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.568713 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b378916816065106db5c72757f6e7729d767cae4dfcd42523647d493283de3a3"} err="failed to get container status \"b378916816065106db5c72757f6e7729d767cae4dfcd42523647d493283de3a3\": rpc error: code = NotFound desc = could not find container \"b378916816065106db5c72757f6e7729d767cae4dfcd42523647d493283de3a3\": container with ID starting with b378916816065106db5c72757f6e7729d767cae4dfcd42523647d493283de3a3 not found: ID does not exist" Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.568749 4628 scope.go:117] "RemoveContainer" containerID="e268a28a0a3c5487c0dc0cac3023de9fa895f3940503635d38a45c2a36438595" Dec 11 06:14:49 crc kubenswrapper[4628]: E1211 06:14:49.569143 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e268a28a0a3c5487c0dc0cac3023de9fa895f3940503635d38a45c2a36438595\": container with ID starting with e268a28a0a3c5487c0dc0cac3023de9fa895f3940503635d38a45c2a36438595 not found: ID does not exist" containerID="e268a28a0a3c5487c0dc0cac3023de9fa895f3940503635d38a45c2a36438595" Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.569179 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e268a28a0a3c5487c0dc0cac3023de9fa895f3940503635d38a45c2a36438595"} err="failed to get container status \"e268a28a0a3c5487c0dc0cac3023de9fa895f3940503635d38a45c2a36438595\": rpc error: code = NotFound desc = could not find container \"e268a28a0a3c5487c0dc0cac3023de9fa895f3940503635d38a45c2a36438595\": container with ID starting with e268a28a0a3c5487c0dc0cac3023de9fa895f3940503635d38a45c2a36438595 not found: ID does not exist" Dec 11 06:14:49 crc kubenswrapper[4628]: I1211 06:14:49.902652 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55d39934-66d0-4edc-97e0-58863702adf9" path="/var/lib/kubelet/pods/55d39934-66d0-4edc-97e0-58863702adf9/volumes" Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 06:15:00.225197 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7"] Dec 11 06:15:00 crc kubenswrapper[4628]: E1211 06:15:00.226146 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d39934-66d0-4edc-97e0-58863702adf9" containerName="extract-utilities" Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 06:15:00.226160 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d39934-66d0-4edc-97e0-58863702adf9" containerName="extract-utilities" Dec 11 06:15:00 crc kubenswrapper[4628]: E1211 06:15:00.226176 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d39934-66d0-4edc-97e0-58863702adf9" containerName="registry-server" Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 
06:15:00.226182 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d39934-66d0-4edc-97e0-58863702adf9" containerName="registry-server" Dec 11 06:15:00 crc kubenswrapper[4628]: E1211 06:15:00.226191 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d39934-66d0-4edc-97e0-58863702adf9" containerName="extract-content" Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 06:15:00.226198 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d39934-66d0-4edc-97e0-58863702adf9" containerName="extract-content" Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 06:15:00.226399 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="55d39934-66d0-4edc-97e0-58863702adf9" containerName="registry-server" Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 06:15:00.227209 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7" Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 06:15:00.232323 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 06:15:00.232936 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 06:15:00.236330 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7"] Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 06:15:00.340755 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8d9320d-519a-4b95-89e5-0abfde9c2267-secret-volume\") pod \"collect-profiles-29423895-gq6k7\" (UID: \"e8d9320d-519a-4b95-89e5-0abfde9c2267\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7" Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 06:15:00.340916 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jglfn\" (UniqueName: \"kubernetes.io/projected/e8d9320d-519a-4b95-89e5-0abfde9c2267-kube-api-access-jglfn\") pod \"collect-profiles-29423895-gq6k7\" (UID: \"e8d9320d-519a-4b95-89e5-0abfde9c2267\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7" Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 06:15:00.341049 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8d9320d-519a-4b95-89e5-0abfde9c2267-config-volume\") pod \"collect-profiles-29423895-gq6k7\" (UID: \"e8d9320d-519a-4b95-89e5-0abfde9c2267\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7" Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 06:15:00.442559 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jglfn\" (UniqueName: \"kubernetes.io/projected/e8d9320d-519a-4b95-89e5-0abfde9c2267-kube-api-access-jglfn\") pod \"collect-profiles-29423895-gq6k7\" (UID: \"e8d9320d-519a-4b95-89e5-0abfde9c2267\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7" Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 06:15:00.442880 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/e8d9320d-519a-4b95-89e5-0abfde9c2267-config-volume\") pod \"collect-profiles-29423895-gq6k7\" (UID: \"e8d9320d-519a-4b95-89e5-0abfde9c2267\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7" Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 06:15:00.443122 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8d9320d-519a-4b95-89e5-0abfde9c2267-secret-volume\") pod \"collect-profiles-29423895-gq6k7\" (UID: \"e8d9320d-519a-4b95-89e5-0abfde9c2267\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7" Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 06:15:00.443982 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8d9320d-519a-4b95-89e5-0abfde9c2267-config-volume\") pod \"collect-profiles-29423895-gq6k7\" (UID: \"e8d9320d-519a-4b95-89e5-0abfde9c2267\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7" Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 06:15:00.449077 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8d9320d-519a-4b95-89e5-0abfde9c2267-secret-volume\") pod \"collect-profiles-29423895-gq6k7\" (UID: \"e8d9320d-519a-4b95-89e5-0abfde9c2267\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7" Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 06:15:00.476139 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jglfn\" (UniqueName: \"kubernetes.io/projected/e8d9320d-519a-4b95-89e5-0abfde9c2267-kube-api-access-jglfn\") pod \"collect-profiles-29423895-gq6k7\" (UID: \"e8d9320d-519a-4b95-89e5-0abfde9c2267\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7" Dec 11 06:15:00 crc kubenswrapper[4628]: I1211 06:15:00.546162 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7" Dec 11 06:15:01 crc kubenswrapper[4628]: I1211 06:15:01.033237 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7"] Dec 11 06:15:01 crc kubenswrapper[4628]: I1211 06:15:01.426484 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:15:01 crc kubenswrapper[4628]: I1211 06:15:01.430875 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:15:01 crc kubenswrapper[4628]: I1211 06:15:01.431075 4628 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 06:15:01 crc kubenswrapper[4628]: I1211 06:15:01.432037 4628 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521"} pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 06:15:01 crc kubenswrapper[4628]: I1211 06:15:01.432208 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" containerID="cri-o://fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" gracePeriod=600 Dec 11 06:15:01 crc kubenswrapper[4628]: E1211 06:15:01.552648 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:15:01 crc kubenswrapper[4628]: I1211 06:15:01.597451 4628 generic.go:334] "Generic (PLEG): container finished" podID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" exitCode=0 Dec 11 06:15:01 crc kubenswrapper[4628]: I1211 06:15:01.597558 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerDied","Data":"fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521"} Dec 11 06:15:01 crc kubenswrapper[4628]: I1211 06:15:01.597648 4628 scope.go:117] "RemoveContainer" containerID="a9d0674039c93a524e570e8ed6bd4cdf2d9c5d8d2cae45af50102adfdc0ac1b0" Dec 11 06:15:01 crc kubenswrapper[4628]: I1211 06:15:01.598335 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:15:01 crc 
kubenswrapper[4628]: E1211 06:15:01.598680 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:15:01 crc kubenswrapper[4628]: I1211 06:15:01.602540 4628 generic.go:334] "Generic (PLEG): container finished" podID="e8d9320d-519a-4b95-89e5-0abfde9c2267" containerID="4a08bdacd55dce03a6de8979ccb35d1978bd53e7e131fed1c7fc763a05fa1783" exitCode=0 Dec 11 06:15:01 crc kubenswrapper[4628]: I1211 06:15:01.602593 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7" event={"ID":"e8d9320d-519a-4b95-89e5-0abfde9c2267","Type":"ContainerDied","Data":"4a08bdacd55dce03a6de8979ccb35d1978bd53e7e131fed1c7fc763a05fa1783"} Dec 11 06:15:01 crc kubenswrapper[4628]: I1211 06:15:01.602623 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7" event={"ID":"e8d9320d-519a-4b95-89e5-0abfde9c2267","Type":"ContainerStarted","Data":"6f31f72c4d29d5e9dfc11c999d816a7d9660a0af6bf990d9ff39d465410829e8"} Dec 11 06:15:02 crc kubenswrapper[4628]: I1211 06:15:02.991931 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7" Dec 11 06:15:03 crc kubenswrapper[4628]: I1211 06:15:03.101805 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8d9320d-519a-4b95-89e5-0abfde9c2267-config-volume\") pod \"e8d9320d-519a-4b95-89e5-0abfde9c2267\" (UID: \"e8d9320d-519a-4b95-89e5-0abfde9c2267\") " Dec 11 06:15:03 crc kubenswrapper[4628]: I1211 06:15:03.102185 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jglfn\" (UniqueName: \"kubernetes.io/projected/e8d9320d-519a-4b95-89e5-0abfde9c2267-kube-api-access-jglfn\") pod \"e8d9320d-519a-4b95-89e5-0abfde9c2267\" (UID: \"e8d9320d-519a-4b95-89e5-0abfde9c2267\") " Dec 11 06:15:03 crc kubenswrapper[4628]: I1211 06:15:03.102368 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8d9320d-519a-4b95-89e5-0abfde9c2267-secret-volume\") pod \"e8d9320d-519a-4b95-89e5-0abfde9c2267\" (UID: \"e8d9320d-519a-4b95-89e5-0abfde9c2267\") " Dec 11 06:15:03 crc kubenswrapper[4628]: I1211 06:15:03.106081 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8d9320d-519a-4b95-89e5-0abfde9c2267-config-volume" (OuterVolumeSpecName: "config-volume") pod "e8d9320d-519a-4b95-89e5-0abfde9c2267" (UID: "e8d9320d-519a-4b95-89e5-0abfde9c2267"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 06:15:03 crc kubenswrapper[4628]: I1211 06:15:03.110970 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8d9320d-519a-4b95-89e5-0abfde9c2267-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e8d9320d-519a-4b95-89e5-0abfde9c2267" (UID: "e8d9320d-519a-4b95-89e5-0abfde9c2267"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 06:15:03 crc kubenswrapper[4628]: I1211 06:15:03.111242 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8d9320d-519a-4b95-89e5-0abfde9c2267-kube-api-access-jglfn" (OuterVolumeSpecName: "kube-api-access-jglfn") pod "e8d9320d-519a-4b95-89e5-0abfde9c2267" (UID: "e8d9320d-519a-4b95-89e5-0abfde9c2267"). InnerVolumeSpecName "kube-api-access-jglfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:15:03 crc kubenswrapper[4628]: I1211 06:15:03.205010 4628 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8d9320d-519a-4b95-89e5-0abfde9c2267-config-volume\") on node \"crc\" DevicePath \"\"" Dec 11 06:15:03 crc kubenswrapper[4628]: I1211 06:15:03.205060 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jglfn\" (UniqueName: \"kubernetes.io/projected/e8d9320d-519a-4b95-89e5-0abfde9c2267-kube-api-access-jglfn\") on node \"crc\" DevicePath \"\"" Dec 11 06:15:03 crc kubenswrapper[4628]: I1211 06:15:03.205080 4628 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8d9320d-519a-4b95-89e5-0abfde9c2267-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 11 06:15:03 crc kubenswrapper[4628]: I1211 06:15:03.622949 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7" event={"ID":"e8d9320d-519a-4b95-89e5-0abfde9c2267","Type":"ContainerDied","Data":"6f31f72c4d29d5e9dfc11c999d816a7d9660a0af6bf990d9ff39d465410829e8"} Dec 11 06:15:03 crc kubenswrapper[4628]: I1211 06:15:03.623008 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f31f72c4d29d5e9dfc11c999d816a7d9660a0af6bf990d9ff39d465410829e8" Dec 11 06:15:03 crc kubenswrapper[4628]: I1211 06:15:03.623013 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423895-gq6k7" Dec 11 06:15:04 crc kubenswrapper[4628]: I1211 06:15:04.073201 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9"] Dec 11 06:15:04 crc kubenswrapper[4628]: I1211 06:15:04.081732 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423850-snfj9"] Dec 11 06:15:05 crc kubenswrapper[4628]: I1211 06:15:05.902716 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2008d0e-e957-41d4-946a-c057dfe90bfb" path="/var/lib/kubelet/pods/b2008d0e-e957-41d4-946a-c057dfe90bfb/volumes" Dec 11 06:15:12 crc kubenswrapper[4628]: I1211 06:15:12.889778 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:15:12 crc kubenswrapper[4628]: E1211 06:15:12.891136 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:15:27 crc kubenswrapper[4628]: I1211 06:15:27.890455 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:15:27 crc kubenswrapper[4628]: E1211 06:15:27.893024 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:15:28 crc kubenswrapper[4628]: I1211 06:15:28.888368 4628 scope.go:117] "RemoveContainer" containerID="9b254b0f254fc13cfc8e09ef8d6532083d6a1d6f2b7e816fabf2ca493e14f7d2" Dec 11 06:15:42 crc kubenswrapper[4628]: I1211 06:15:42.889696 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:15:42 crc kubenswrapper[4628]: E1211 06:15:42.891480 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:15:56 crc kubenswrapper[4628]: I1211 06:15:56.889589 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:15:56 crc kubenswrapper[4628]: E1211 06:15:56.890454 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:16:08 crc kubenswrapper[4628]: I1211 06:16:08.889883 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:16:08 crc kubenswrapper[4628]: E1211 06:16:08.891003 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:16:23 crc kubenswrapper[4628]: I1211 06:16:23.889705 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:16:23 crc kubenswrapper[4628]: E1211 06:16:23.890636 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:16:34 crc kubenswrapper[4628]: I1211 06:16:34.889994 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:16:34 crc kubenswrapper[4628]: E1211 06:16:34.891244 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:16:45 crc kubenswrapper[4628]: I1211 06:16:45.889278 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:16:45 crc kubenswrapper[4628]: E1211 06:16:45.890111 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:16:59 crc kubenswrapper[4628]: I1211 06:16:59.890487 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:16:59 crc kubenswrapper[4628]: E1211 06:16:59.891255 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:17:14 crc kubenswrapper[4628]: I1211 06:17:14.889906 4628 
scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:17:14 crc kubenswrapper[4628]: E1211 06:17:14.891526 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:17:27 crc kubenswrapper[4628]: I1211 06:17:27.897936 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:17:27 crc kubenswrapper[4628]: E1211 06:17:27.898992 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:17:41 crc kubenswrapper[4628]: I1211 06:17:41.890379 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:17:41 crc kubenswrapper[4628]: E1211 06:17:41.891154 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:17:52 crc kubenswrapper[4628]: I1211 06:17:52.891594 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:17:52 crc kubenswrapper[4628]: E1211 06:17:52.892458 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:18:05 crc kubenswrapper[4628]: I1211 06:18:05.897111 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:18:05 crc kubenswrapper[4628]: E1211 06:18:05.897928 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:18:20 crc kubenswrapper[4628]: I1211 06:18:20.889407 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:18:20 crc kubenswrapper[4628]: E1211 06:18:20.890739 4628 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:18:34 crc kubenswrapper[4628]: I1211 06:18:34.889330 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:18:34 crc kubenswrapper[4628]: E1211 06:18:34.890265 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:18:45 crc kubenswrapper[4628]: I1211 06:18:45.890041 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:18:45 crc kubenswrapper[4628]: E1211 06:18:45.893782 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:18:59 crc kubenswrapper[4628]: I1211 06:18:59.890041 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:18:59 crc kubenswrapper[4628]: E1211 06:18:59.890868 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:19:11 crc kubenswrapper[4628]: I1211 06:19:11.890371 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:19:11 crc kubenswrapper[4628]: E1211 06:19:11.891062 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:19:24 crc kubenswrapper[4628]: I1211 06:19:24.891299 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:19:24 crc kubenswrapper[4628]: E1211 06:19:24.892211 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:19:38 crc kubenswrapper[4628]: I1211 06:19:38.889909 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:19:38 crc kubenswrapper[4628]: E1211 06:19:38.890612 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:19:49 crc kubenswrapper[4628]: I1211 06:19:49.889113 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:19:49 crc kubenswrapper[4628]: E1211 06:19:49.889970 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:20:03 crc kubenswrapper[4628]: I1211 06:20:03.889327 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:20:04 crc kubenswrapper[4628]: I1211 06:20:04.335301 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"3f7b78a4d45fc2941babafd07cee8c3222be00b496ea76158561b39fc28da3cd"} Dec 11 06:21:05 crc kubenswrapper[4628]: I1211 06:21:05.000467 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wg8m2"] Dec 11 06:21:05 crc kubenswrapper[4628]: E1211 06:21:05.001465 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d9320d-519a-4b95-89e5-0abfde9c2267" containerName="collect-profiles" Dec 11 06:21:05 crc kubenswrapper[4628]: I1211 06:21:05.001494 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d9320d-519a-4b95-89e5-0abfde9c2267" containerName="collect-profiles" Dec 11 06:21:05 crc kubenswrapper[4628]: I1211 06:21:05.001753 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d9320d-519a-4b95-89e5-0abfde9c2267" containerName="collect-profiles" Dec 11 06:21:05 crc kubenswrapper[4628]: I1211 06:21:05.004248 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wg8m2" Dec 11 06:21:05 crc kubenswrapper[4628]: I1211 06:21:05.032704 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wg8m2"] Dec 11 06:21:05 crc kubenswrapper[4628]: I1211 06:21:05.134272 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wft9x\" (UniqueName: \"kubernetes.io/projected/49c38217-7f74-447b-a8ab-b7bf727d90e5-kube-api-access-wft9x\") pod \"certified-operators-wg8m2\" (UID: \"49c38217-7f74-447b-a8ab-b7bf727d90e5\") " pod="openshift-marketplace/certified-operators-wg8m2" Dec 11 06:21:05 crc kubenswrapper[4628]: I1211 06:21:05.134409 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49c38217-7f74-447b-a8ab-b7bf727d90e5-catalog-content\") pod \"certified-operators-wg8m2\" (UID: \"49c38217-7f74-447b-a8ab-b7bf727d90e5\") " pod="openshift-marketplace/certified-operators-wg8m2" Dec 11 06:21:05 crc kubenswrapper[4628]: I1211 06:21:05.134482 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49c38217-7f74-447b-a8ab-b7bf727d90e5-utilities\") pod \"certified-operators-wg8m2\" (UID: \"49c38217-7f74-447b-a8ab-b7bf727d90e5\") " pod="openshift-marketplace/certified-operators-wg8m2" Dec 11 06:21:05 crc kubenswrapper[4628]: I1211 06:21:05.236408 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wft9x\" (UniqueName: \"kubernetes.io/projected/49c38217-7f74-447b-a8ab-b7bf727d90e5-kube-api-access-wft9x\") pod \"certified-operators-wg8m2\" (UID: \"49c38217-7f74-447b-a8ab-b7bf727d90e5\") " pod="openshift-marketplace/certified-operators-wg8m2" Dec 11 06:21:05 crc kubenswrapper[4628]: I1211 06:21:05.236544 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49c38217-7f74-447b-a8ab-b7bf727d90e5-catalog-content\") pod \"certified-operators-wg8m2\" (UID: \"49c38217-7f74-447b-a8ab-b7bf727d90e5\") " pod="openshift-marketplace/certified-operators-wg8m2" Dec 11 06:21:05 crc kubenswrapper[4628]: I1211 06:21:05.236616 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49c38217-7f74-447b-a8ab-b7bf727d90e5-utilities\") pod \"certified-operators-wg8m2\" (UID: \"49c38217-7f74-447b-a8ab-b7bf727d90e5\") " pod="openshift-marketplace/certified-operators-wg8m2" Dec 11 06:21:05 crc kubenswrapper[4628]: I1211 06:21:05.237147 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49c38217-7f74-447b-a8ab-b7bf727d90e5-catalog-content\") pod \"certified-operators-wg8m2\" (UID: \"49c38217-7f74-447b-a8ab-b7bf727d90e5\") " pod="openshift-marketplace/certified-operators-wg8m2" Dec 11 06:21:05 crc kubenswrapper[4628]: I1211 06:21:05.237158 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49c38217-7f74-447b-a8ab-b7bf727d90e5-utilities\") pod \"certified-operators-wg8m2\" (UID: \"49c38217-7f74-447b-a8ab-b7bf727d90e5\") " pod="openshift-marketplace/certified-operators-wg8m2" Dec 11 06:21:05 crc kubenswrapper[4628]: I1211 06:21:05.264601 4628 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wft9x\" (UniqueName: \"kubernetes.io/projected/49c38217-7f74-447b-a8ab-b7bf727d90e5-kube-api-access-wft9x\") pod \"certified-operators-wg8m2\" (UID: \"49c38217-7f74-447b-a8ab-b7bf727d90e5\") " pod="openshift-marketplace/certified-operators-wg8m2" Dec 11 06:21:05 crc kubenswrapper[4628]: I1211 06:21:05.326573 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wg8m2" Dec 11 06:21:05 crc kubenswrapper[4628]: I1211 06:21:05.879922 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wg8m2"] Dec 11 06:21:05 crc kubenswrapper[4628]: I1211 06:21:05.939058 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wg8m2" event={"ID":"49c38217-7f74-447b-a8ab-b7bf727d90e5","Type":"ContainerStarted","Data":"50c694fe48b9247934628d1f2ce0674ebdd7f957597605dfafb0463a61ce51d2"} Dec 11 06:21:06 crc kubenswrapper[4628]: I1211 06:21:06.966859 4628 generic.go:334] "Generic (PLEG): container finished" podID="49c38217-7f74-447b-a8ab-b7bf727d90e5" containerID="fdc912f9509135d41f0fb1dbb68070af9827bc559783da7487c083ea22b3ae71" exitCode=0 Dec 11 06:21:06 crc kubenswrapper[4628]: I1211 06:21:06.967111 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wg8m2" event={"ID":"49c38217-7f74-447b-a8ab-b7bf727d90e5","Type":"ContainerDied","Data":"fdc912f9509135d41f0fb1dbb68070af9827bc559783da7487c083ea22b3ae71"} Dec 11 06:21:06 crc kubenswrapper[4628]: I1211 06:21:06.969312 4628 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 06:21:13 crc kubenswrapper[4628]: I1211 06:21:13.022452 4628 generic.go:334] "Generic (PLEG): container finished" podID="49c38217-7f74-447b-a8ab-b7bf727d90e5" containerID="9e8864c0fc18d36f3923d905027bc2718c4ac21c34f769caf4bca3e3a0589436" exitCode=0 Dec 11 06:21:13 crc kubenswrapper[4628]: I1211 06:21:13.022529 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wg8m2" event={"ID":"49c38217-7f74-447b-a8ab-b7bf727d90e5","Type":"ContainerDied","Data":"9e8864c0fc18d36f3923d905027bc2718c4ac21c34f769caf4bca3e3a0589436"} Dec 11 06:21:14 crc kubenswrapper[4628]: I1211 06:21:14.035198 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wg8m2" event={"ID":"49c38217-7f74-447b-a8ab-b7bf727d90e5","Type":"ContainerStarted","Data":"02cb9760463186551b136905881bf50e486770a0c23c6033befb878e7beb7ce8"} Dec 11 06:21:14 crc kubenswrapper[4628]: I1211 06:21:14.058408 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wg8m2" podStartSLOduration=3.508524045 podStartE2EDuration="10.058393333s" podCreationTimestamp="2025-12-11 06:21:04 +0000 UTC" firstStartedPulling="2025-12-11 06:21:06.969082236 +0000 UTC m=+3969.386428934" lastFinishedPulling="2025-12-11 06:21:13.518951484 +0000 UTC m=+3975.936298222" observedRunningTime="2025-12-11 06:21:14.05572346 +0000 UTC m=+3976.473070158" watchObservedRunningTime="2025-12-11 06:21:14.058393333 +0000 UTC m=+3976.475740031" Dec 11 06:21:15 crc kubenswrapper[4628]: I1211 06:21:15.327302 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wg8m2" Dec 11 06:21:15 crc kubenswrapper[4628]: I1211 06:21:15.327650 4628 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wg8m2" Dec 11 06:21:16 crc kubenswrapper[4628]: I1211 06:21:16.386691 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-wg8m2" podUID="49c38217-7f74-447b-a8ab-b7bf727d90e5" containerName="registry-server" probeResult="failure" output=< Dec 11 06:21:16 crc kubenswrapper[4628]: timeout: failed to connect service ":50051" within 1s Dec 11 06:21:16 crc kubenswrapper[4628]: > Dec 11 06:21:25 crc kubenswrapper[4628]: I1211 06:21:25.386703 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wg8m2" Dec 11 06:21:25 crc kubenswrapper[4628]: I1211 06:21:25.450612 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wg8m2" Dec 11 06:21:25 crc kubenswrapper[4628]: I1211 06:21:25.523330 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wg8m2"] Dec 11 06:21:25 crc kubenswrapper[4628]: I1211 06:21:25.633879 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zf9t5"] Dec 11 06:21:25 crc kubenswrapper[4628]: I1211 06:21:25.634176 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zf9t5" podUID="2466dfeb-42c9-4a50-9a80-502565709587" containerName="registry-server" containerID="cri-o://e92cc3a526637b2cddb5e9d9f7076fa10d89ab32cfb573df293e775f12468ea1" gracePeriod=2 Dec 11 06:21:26 crc kubenswrapper[4628]: I1211 06:21:26.142198 4628 generic.go:334] "Generic (PLEG): container finished" podID="2466dfeb-42c9-4a50-9a80-502565709587" containerID="e92cc3a526637b2cddb5e9d9f7076fa10d89ab32cfb573df293e775f12468ea1" exitCode=0 Dec 11 06:21:26 crc kubenswrapper[4628]: I1211 06:21:26.142338 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zf9t5" event={"ID":"2466dfeb-42c9-4a50-9a80-502565709587","Type":"ContainerDied","Data":"e92cc3a526637b2cddb5e9d9f7076fa10d89ab32cfb573df293e775f12468ea1"} Dec 11 06:21:26 crc kubenswrapper[4628]: I1211 06:21:26.284802 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zf9t5" Dec 11 06:21:26 crc kubenswrapper[4628]: I1211 06:21:26.407170 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2466dfeb-42c9-4a50-9a80-502565709587-utilities\") pod \"2466dfeb-42c9-4a50-9a80-502565709587\" (UID: \"2466dfeb-42c9-4a50-9a80-502565709587\") " Dec 11 06:21:26 crc kubenswrapper[4628]: I1211 06:21:26.408218 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2466dfeb-42c9-4a50-9a80-502565709587-catalog-content\") pod \"2466dfeb-42c9-4a50-9a80-502565709587\" (UID: \"2466dfeb-42c9-4a50-9a80-502565709587\") " Dec 11 06:21:26 crc kubenswrapper[4628]: I1211 06:21:26.407664 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2466dfeb-42c9-4a50-9a80-502565709587-utilities" (OuterVolumeSpecName: "utilities") pod "2466dfeb-42c9-4a50-9a80-502565709587" (UID: "2466dfeb-42c9-4a50-9a80-502565709587"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:21:26 crc kubenswrapper[4628]: I1211 06:21:26.408517 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhjpc\" (UniqueName: \"kubernetes.io/projected/2466dfeb-42c9-4a50-9a80-502565709587-kube-api-access-jhjpc\") pod \"2466dfeb-42c9-4a50-9a80-502565709587\" (UID: \"2466dfeb-42c9-4a50-9a80-502565709587\") " Dec 11 06:21:26 crc kubenswrapper[4628]: I1211 06:21:26.409154 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2466dfeb-42c9-4a50-9a80-502565709587-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 06:21:26 crc kubenswrapper[4628]: I1211 06:21:26.432012 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2466dfeb-42c9-4a50-9a80-502565709587-kube-api-access-jhjpc" (OuterVolumeSpecName: "kube-api-access-jhjpc") pod "2466dfeb-42c9-4a50-9a80-502565709587" (UID: "2466dfeb-42c9-4a50-9a80-502565709587"). InnerVolumeSpecName "kube-api-access-jhjpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:21:26 crc kubenswrapper[4628]: I1211 06:21:26.460013 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2466dfeb-42c9-4a50-9a80-502565709587-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2466dfeb-42c9-4a50-9a80-502565709587" (UID: "2466dfeb-42c9-4a50-9a80-502565709587"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:21:26 crc kubenswrapper[4628]: I1211 06:21:26.511281 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhjpc\" (UniqueName: \"kubernetes.io/projected/2466dfeb-42c9-4a50-9a80-502565709587-kube-api-access-jhjpc\") on node \"crc\" DevicePath \"\"" Dec 11 06:21:26 crc kubenswrapper[4628]: I1211 06:21:26.511316 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2466dfeb-42c9-4a50-9a80-502565709587-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 06:21:27 crc kubenswrapper[4628]: I1211 06:21:27.156099 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zf9t5" Dec 11 06:21:27 crc kubenswrapper[4628]: I1211 06:21:27.156299 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zf9t5" event={"ID":"2466dfeb-42c9-4a50-9a80-502565709587","Type":"ContainerDied","Data":"2cd2780dc2b9bffe155950d227b5c69e615ad795091487c015058eab38ee7de6"} Dec 11 06:21:27 crc kubenswrapper[4628]: I1211 06:21:27.157935 4628 scope.go:117] "RemoveContainer" containerID="e92cc3a526637b2cddb5e9d9f7076fa10d89ab32cfb573df293e775f12468ea1" Dec 11 06:21:27 crc kubenswrapper[4628]: I1211 06:21:27.190310 4628 scope.go:117] "RemoveContainer" containerID="fbea465050247fb0fc3af68c69c263005ab286560b44d17b204323580f787fa9" Dec 11 06:21:27 crc kubenswrapper[4628]: I1211 06:21:27.207608 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zf9t5"] Dec 11 06:21:27 crc kubenswrapper[4628]: I1211 06:21:27.219578 4628 scope.go:117] "RemoveContainer" containerID="edf22d08f97d1d6b8191199361d972678f3febfb616e424436293b5e84f91b5e" Dec 11 06:21:27 crc kubenswrapper[4628]: I1211 06:21:27.231620 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zf9t5"] Dec 11 06:21:27 crc kubenswrapper[4628]: I1211 06:21:27.919813 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2466dfeb-42c9-4a50-9a80-502565709587" path="/var/lib/kubelet/pods/2466dfeb-42c9-4a50-9a80-502565709587/volumes" Dec 11 06:21:32 crc kubenswrapper[4628]: I1211 06:21:32.644184 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d9c48"] Dec 11 06:21:32 crc kubenswrapper[4628]: E1211 06:21:32.646298 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2466dfeb-42c9-4a50-9a80-502565709587" containerName="extract-utilities" Dec 11 06:21:32 crc kubenswrapper[4628]: I1211 06:21:32.646398 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2466dfeb-42c9-4a50-9a80-502565709587" containerName="extract-utilities" Dec 11 06:21:32 crc kubenswrapper[4628]: E1211 06:21:32.646476 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2466dfeb-42c9-4a50-9a80-502565709587" containerName="extract-content" Dec 11 06:21:32 crc kubenswrapper[4628]: I1211 06:21:32.646605 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2466dfeb-42c9-4a50-9a80-502565709587" containerName="extract-content" Dec 11 06:21:32 crc kubenswrapper[4628]: E1211 06:21:32.646705 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2466dfeb-42c9-4a50-9a80-502565709587" containerName="registry-server" Dec 11 06:21:32 crc kubenswrapper[4628]: I1211 06:21:32.646783 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="2466dfeb-42c9-4a50-9a80-502565709587" containerName="registry-server" Dec 11 06:21:32 crc kubenswrapper[4628]: I1211 06:21:32.647190 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="2466dfeb-42c9-4a50-9a80-502565709587" containerName="registry-server" Dec 11 06:21:32 crc kubenswrapper[4628]: I1211 06:21:32.648972 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9c48" Dec 11 06:21:32 crc kubenswrapper[4628]: I1211 06:21:32.669902 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9c48"] Dec 11 06:21:32 crc kubenswrapper[4628]: I1211 06:21:32.735982 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdbxw\" (UniqueName: \"kubernetes.io/projected/df44463c-e622-4f99-a203-bbc33d7af21e-kube-api-access-rdbxw\") pod \"redhat-marketplace-d9c48\" (UID: \"df44463c-e622-4f99-a203-bbc33d7af21e\") " pod="openshift-marketplace/redhat-marketplace-d9c48" Dec 11 06:21:32 crc kubenswrapper[4628]: I1211 06:21:32.736018 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df44463c-e622-4f99-a203-bbc33d7af21e-catalog-content\") pod \"redhat-marketplace-d9c48\" (UID: \"df44463c-e622-4f99-a203-bbc33d7af21e\") " pod="openshift-marketplace/redhat-marketplace-d9c48" Dec 11 06:21:32 crc kubenswrapper[4628]: I1211 06:21:32.736059 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df44463c-e622-4f99-a203-bbc33d7af21e-utilities\") pod \"redhat-marketplace-d9c48\" (UID: \"df44463c-e622-4f99-a203-bbc33d7af21e\") " pod="openshift-marketplace/redhat-marketplace-d9c48" Dec 11 06:21:32 crc kubenswrapper[4628]: I1211 06:21:32.838211 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdbxw\" (UniqueName: \"kubernetes.io/projected/df44463c-e622-4f99-a203-bbc33d7af21e-kube-api-access-rdbxw\") pod \"redhat-marketplace-d9c48\" (UID: \"df44463c-e622-4f99-a203-bbc33d7af21e\") " pod="openshift-marketplace/redhat-marketplace-d9c48" Dec 11 06:21:32 crc kubenswrapper[4628]: I1211 06:21:32.838522 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df44463c-e622-4f99-a203-bbc33d7af21e-catalog-content\") pod \"redhat-marketplace-d9c48\" (UID: \"df44463c-e622-4f99-a203-bbc33d7af21e\") " pod="openshift-marketplace/redhat-marketplace-d9c48" Dec 11 06:21:32 crc kubenswrapper[4628]: I1211 06:21:32.839034 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df44463c-e622-4f99-a203-bbc33d7af21e-catalog-content\") pod \"redhat-marketplace-d9c48\" (UID: \"df44463c-e622-4f99-a203-bbc33d7af21e\") " pod="openshift-marketplace/redhat-marketplace-d9c48" Dec 11 06:21:32 crc kubenswrapper[4628]: I1211 06:21:32.839089 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df44463c-e622-4f99-a203-bbc33d7af21e-utilities\") pod \"redhat-marketplace-d9c48\" (UID: \"df44463c-e622-4f99-a203-bbc33d7af21e\") " pod="openshift-marketplace/redhat-marketplace-d9c48" Dec 11 06:21:32 crc kubenswrapper[4628]: I1211 06:21:32.839332 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df44463c-e622-4f99-a203-bbc33d7af21e-utilities\") pod \"redhat-marketplace-d9c48\" (UID: \"df44463c-e622-4f99-a203-bbc33d7af21e\") " pod="openshift-marketplace/redhat-marketplace-d9c48" Dec 11 06:21:32 crc kubenswrapper[4628]: I1211 06:21:32.860668 4628 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rdbxw\" (UniqueName: \"kubernetes.io/projected/df44463c-e622-4f99-a203-bbc33d7af21e-kube-api-access-rdbxw\") pod \"redhat-marketplace-d9c48\" (UID: \"df44463c-e622-4f99-a203-bbc33d7af21e\") " pod="openshift-marketplace/redhat-marketplace-d9c48" Dec 11 06:21:32 crc kubenswrapper[4628]: I1211 06:21:32.979670 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9c48" Dec 11 06:21:33 crc kubenswrapper[4628]: I1211 06:21:33.480025 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9c48"] Dec 11 06:21:34 crc kubenswrapper[4628]: I1211 06:21:34.217810 4628 generic.go:334] "Generic (PLEG): container finished" podID="df44463c-e622-4f99-a203-bbc33d7af21e" containerID="5110d2814b236722143133765ace96c692e5d9b7f7b4425b7ed047f02137d4a8" exitCode=0 Dec 11 06:21:34 crc kubenswrapper[4628]: I1211 06:21:34.217961 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9c48" event={"ID":"df44463c-e622-4f99-a203-bbc33d7af21e","Type":"ContainerDied","Data":"5110d2814b236722143133765ace96c692e5d9b7f7b4425b7ed047f02137d4a8"} Dec 11 06:21:34 crc kubenswrapper[4628]: I1211 06:21:34.218100 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9c48" event={"ID":"df44463c-e622-4f99-a203-bbc33d7af21e","Type":"ContainerStarted","Data":"d6a6b0ddd47da33387af1eedc9069ae5f6ae9d2bfd02d99458a178f79cabf6ae"} Dec 11 06:21:35 crc kubenswrapper[4628]: I1211 06:21:35.231066 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9c48" event={"ID":"df44463c-e622-4f99-a203-bbc33d7af21e","Type":"ContainerStarted","Data":"f63a71d43cb9d0ce62f3f6fcd106b00632afd0d2fd048b8f856bdfe6c156d12f"} Dec 11 06:21:36 crc kubenswrapper[4628]: I1211 06:21:36.246286 4628 generic.go:334] "Generic (PLEG): container finished" podID="df44463c-e622-4f99-a203-bbc33d7af21e" containerID="f63a71d43cb9d0ce62f3f6fcd106b00632afd0d2fd048b8f856bdfe6c156d12f" exitCode=0 Dec 11 06:21:36 crc kubenswrapper[4628]: I1211 06:21:36.246342 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9c48" event={"ID":"df44463c-e622-4f99-a203-bbc33d7af21e","Type":"ContainerDied","Data":"f63a71d43cb9d0ce62f3f6fcd106b00632afd0d2fd048b8f856bdfe6c156d12f"} Dec 11 06:21:37 crc kubenswrapper[4628]: I1211 06:21:37.262055 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9c48" event={"ID":"df44463c-e622-4f99-a203-bbc33d7af21e","Type":"ContainerStarted","Data":"5eaef875521c75516f89e2899434c90e66735bf550078686884eedb1c8a95378"} Dec 11 06:21:38 crc kubenswrapper[4628]: I1211 06:21:38.304309 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d9c48" podStartSLOduration=3.677953526 podStartE2EDuration="6.30428274s" podCreationTimestamp="2025-12-11 06:21:32 +0000 UTC" firstStartedPulling="2025-12-11 06:21:34.219928226 +0000 UTC m=+3996.637274934" lastFinishedPulling="2025-12-11 06:21:36.84625745 +0000 UTC m=+3999.263604148" observedRunningTime="2025-12-11 06:21:38.289383679 +0000 UTC m=+4000.706730377" watchObservedRunningTime="2025-12-11 06:21:38.30428274 +0000 UTC m=+4000.721629438" Dec 11 06:21:42 crc kubenswrapper[4628]: I1211 06:21:42.980529 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-d9c48" Dec 11 06:21:42 crc kubenswrapper[4628]: I1211 06:21:42.980914 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d9c48" Dec 11 06:21:43 crc kubenswrapper[4628]: I1211 06:21:43.025830 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d9c48" Dec 11 06:21:43 crc kubenswrapper[4628]: I1211 06:21:43.356808 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d9c48" Dec 11 06:21:43 crc kubenswrapper[4628]: I1211 06:21:43.407035 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9c48"] Dec 11 06:21:45 crc kubenswrapper[4628]: I1211 06:21:45.339211 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d9c48" podUID="df44463c-e622-4f99-a203-bbc33d7af21e" containerName="registry-server" containerID="cri-o://5eaef875521c75516f89e2899434c90e66735bf550078686884eedb1c8a95378" gracePeriod=2 Dec 11 06:21:45 crc kubenswrapper[4628]: I1211 06:21:45.689168 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rwxm9"] Dec 11 06:21:45 crc kubenswrapper[4628]: I1211 06:21:45.691621 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwxm9" Dec 11 06:21:45 crc kubenswrapper[4628]: I1211 06:21:45.724266 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rwxm9"] Dec 11 06:21:45 crc kubenswrapper[4628]: I1211 06:21:45.812724 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d378db0-a631-4f38-a46a-8d5e1ddeda86-catalog-content\") pod \"community-operators-rwxm9\" (UID: \"6d378db0-a631-4f38-a46a-8d5e1ddeda86\") " pod="openshift-marketplace/community-operators-rwxm9" Dec 11 06:21:45 crc kubenswrapper[4628]: I1211 06:21:45.813112 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kqwx\" (UniqueName: \"kubernetes.io/projected/6d378db0-a631-4f38-a46a-8d5e1ddeda86-kube-api-access-2kqwx\") pod \"community-operators-rwxm9\" (UID: \"6d378db0-a631-4f38-a46a-8d5e1ddeda86\") " pod="openshift-marketplace/community-operators-rwxm9" Dec 11 06:21:45 crc kubenswrapper[4628]: I1211 06:21:45.813392 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d378db0-a631-4f38-a46a-8d5e1ddeda86-utilities\") pod \"community-operators-rwxm9\" (UID: \"6d378db0-a631-4f38-a46a-8d5e1ddeda86\") " pod="openshift-marketplace/community-operators-rwxm9" Dec 11 06:21:45 crc kubenswrapper[4628]: I1211 06:21:45.915044 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d378db0-a631-4f38-a46a-8d5e1ddeda86-utilities\") pod \"community-operators-rwxm9\" (UID: \"6d378db0-a631-4f38-a46a-8d5e1ddeda86\") " pod="openshift-marketplace/community-operators-rwxm9" Dec 11 06:21:45 crc kubenswrapper[4628]: I1211 06:21:45.915183 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6d378db0-a631-4f38-a46a-8d5e1ddeda86-catalog-content\") pod \"community-operators-rwxm9\" (UID: \"6d378db0-a631-4f38-a46a-8d5e1ddeda86\") " pod="openshift-marketplace/community-operators-rwxm9" Dec 11 06:21:45 crc kubenswrapper[4628]: I1211 06:21:45.915241 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kqwx\" (UniqueName: \"kubernetes.io/projected/6d378db0-a631-4f38-a46a-8d5e1ddeda86-kube-api-access-2kqwx\") pod \"community-operators-rwxm9\" (UID: \"6d378db0-a631-4f38-a46a-8d5e1ddeda86\") " pod="openshift-marketplace/community-operators-rwxm9" Dec 11 06:21:45 crc kubenswrapper[4628]: I1211 06:21:45.915803 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d378db0-a631-4f38-a46a-8d5e1ddeda86-catalog-content\") pod \"community-operators-rwxm9\" (UID: \"6d378db0-a631-4f38-a46a-8d5e1ddeda86\") " pod="openshift-marketplace/community-operators-rwxm9" Dec 11 06:21:45 crc kubenswrapper[4628]: I1211 06:21:45.915812 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d378db0-a631-4f38-a46a-8d5e1ddeda86-utilities\") pod \"community-operators-rwxm9\" (UID: \"6d378db0-a631-4f38-a46a-8d5e1ddeda86\") " pod="openshift-marketplace/community-operators-rwxm9" Dec 11 06:21:45 crc kubenswrapper[4628]: I1211 06:21:45.938446 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kqwx\" (UniqueName: \"kubernetes.io/projected/6d378db0-a631-4f38-a46a-8d5e1ddeda86-kube-api-access-2kqwx\") pod \"community-operators-rwxm9\" (UID: \"6d378db0-a631-4f38-a46a-8d5e1ddeda86\") " pod="openshift-marketplace/community-operators-rwxm9" Dec 11 06:21:46 crc kubenswrapper[4628]: I1211 06:21:46.012827 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwxm9" Dec 11 06:21:46 crc kubenswrapper[4628]: I1211 06:21:46.359651 4628 generic.go:334] "Generic (PLEG): container finished" podID="df44463c-e622-4f99-a203-bbc33d7af21e" containerID="5eaef875521c75516f89e2899434c90e66735bf550078686884eedb1c8a95378" exitCode=0 Dec 11 06:21:46 crc kubenswrapper[4628]: I1211 06:21:46.359898 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9c48" event={"ID":"df44463c-e622-4f99-a203-bbc33d7af21e","Type":"ContainerDied","Data":"5eaef875521c75516f89e2899434c90e66735bf550078686884eedb1c8a95378"} Dec 11 06:21:46 crc kubenswrapper[4628]: I1211 06:21:46.697773 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9c48" Dec 11 06:21:46 crc kubenswrapper[4628]: I1211 06:21:46.738758 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdbxw\" (UniqueName: \"kubernetes.io/projected/df44463c-e622-4f99-a203-bbc33d7af21e-kube-api-access-rdbxw\") pod \"df44463c-e622-4f99-a203-bbc33d7af21e\" (UID: \"df44463c-e622-4f99-a203-bbc33d7af21e\") " Dec 11 06:21:46 crc kubenswrapper[4628]: I1211 06:21:46.738957 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df44463c-e622-4f99-a203-bbc33d7af21e-catalog-content\") pod \"df44463c-e622-4f99-a203-bbc33d7af21e\" (UID: \"df44463c-e622-4f99-a203-bbc33d7af21e\") " Dec 11 06:21:46 crc kubenswrapper[4628]: I1211 06:21:46.739140 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df44463c-e622-4f99-a203-bbc33d7af21e-utilities\") pod \"df44463c-e622-4f99-a203-bbc33d7af21e\" (UID: \"df44463c-e622-4f99-a203-bbc33d7af21e\") " Dec 11 06:21:46 crc kubenswrapper[4628]: I1211 06:21:46.742264 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df44463c-e622-4f99-a203-bbc33d7af21e-utilities" (OuterVolumeSpecName: "utilities") pod "df44463c-e622-4f99-a203-bbc33d7af21e" (UID: "df44463c-e622-4f99-a203-bbc33d7af21e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:21:46 crc kubenswrapper[4628]: I1211 06:21:46.756146 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df44463c-e622-4f99-a203-bbc33d7af21e-kube-api-access-rdbxw" (OuterVolumeSpecName: "kube-api-access-rdbxw") pod "df44463c-e622-4f99-a203-bbc33d7af21e" (UID: "df44463c-e622-4f99-a203-bbc33d7af21e"). InnerVolumeSpecName "kube-api-access-rdbxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:21:46 crc kubenswrapper[4628]: I1211 06:21:46.806836 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rwxm9"] Dec 11 06:21:46 crc kubenswrapper[4628]: I1211 06:21:46.827244 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df44463c-e622-4f99-a203-bbc33d7af21e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df44463c-e622-4f99-a203-bbc33d7af21e" (UID: "df44463c-e622-4f99-a203-bbc33d7af21e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:21:46 crc kubenswrapper[4628]: I1211 06:21:46.846979 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df44463c-e622-4f99-a203-bbc33d7af21e-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 06:21:46 crc kubenswrapper[4628]: I1211 06:21:46.847048 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdbxw\" (UniqueName: \"kubernetes.io/projected/df44463c-e622-4f99-a203-bbc33d7af21e-kube-api-access-rdbxw\") on node \"crc\" DevicePath \"\"" Dec 11 06:21:46 crc kubenswrapper[4628]: I1211 06:21:46.847066 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df44463c-e622-4f99-a203-bbc33d7af21e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 06:21:47 crc kubenswrapper[4628]: I1211 06:21:47.372386 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9c48" event={"ID":"df44463c-e622-4f99-a203-bbc33d7af21e","Type":"ContainerDied","Data":"d6a6b0ddd47da33387af1eedc9069ae5f6ae9d2bfd02d99458a178f79cabf6ae"} Dec 11 06:21:47 crc kubenswrapper[4628]: I1211 06:21:47.372467 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9c48" Dec 11 06:21:47 crc kubenswrapper[4628]: I1211 06:21:47.372722 4628 scope.go:117] "RemoveContainer" containerID="5eaef875521c75516f89e2899434c90e66735bf550078686884eedb1c8a95378" Dec 11 06:21:47 crc kubenswrapper[4628]: I1211 06:21:47.378424 4628 generic.go:334] "Generic (PLEG): container finished" podID="6d378db0-a631-4f38-a46a-8d5e1ddeda86" containerID="013faddf0dd6fae1bd17a921c677b41b27dd82e7a8266f294b8940c24c543e77" exitCode=0 Dec 11 06:21:47 crc kubenswrapper[4628]: I1211 06:21:47.378465 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwxm9" event={"ID":"6d378db0-a631-4f38-a46a-8d5e1ddeda86","Type":"ContainerDied","Data":"013faddf0dd6fae1bd17a921c677b41b27dd82e7a8266f294b8940c24c543e77"} Dec 11 06:21:47 crc kubenswrapper[4628]: I1211 06:21:47.378491 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwxm9" event={"ID":"6d378db0-a631-4f38-a46a-8d5e1ddeda86","Type":"ContainerStarted","Data":"e4bb773e243d18f41322976dae25d48f090e7f62c0d732577cf45240520c515c"} Dec 11 06:21:47 crc kubenswrapper[4628]: I1211 06:21:47.397974 4628 scope.go:117] "RemoveContainer" containerID="f63a71d43cb9d0ce62f3f6fcd106b00632afd0d2fd048b8f856bdfe6c156d12f" Dec 11 06:21:47 crc kubenswrapper[4628]: I1211 06:21:47.419974 4628 scope.go:117] "RemoveContainer" containerID="5110d2814b236722143133765ace96c692e5d9b7f7b4425b7ed047f02137d4a8" Dec 11 06:21:47 crc kubenswrapper[4628]: I1211 06:21:47.454229 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9c48"] Dec 11 06:21:47 crc kubenswrapper[4628]: I1211 06:21:47.464416 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9c48"] Dec 11 06:21:47 crc kubenswrapper[4628]: I1211 06:21:47.911289 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df44463c-e622-4f99-a203-bbc33d7af21e" path="/var/lib/kubelet/pods/df44463c-e622-4f99-a203-bbc33d7af21e/volumes" Dec 11 06:21:48 crc kubenswrapper[4628]: I1211 06:21:48.389985 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-rwxm9" event={"ID":"6d378db0-a631-4f38-a46a-8d5e1ddeda86","Type":"ContainerStarted","Data":"16e16142ba8f06f36d81b23bda8537dbcbecfab51bec92e952b44e5b72526daf"} Dec 11 06:21:49 crc kubenswrapper[4628]: I1211 06:21:49.404795 4628 generic.go:334] "Generic (PLEG): container finished" podID="6d378db0-a631-4f38-a46a-8d5e1ddeda86" containerID="16e16142ba8f06f36d81b23bda8537dbcbecfab51bec92e952b44e5b72526daf" exitCode=0 Dec 11 06:21:49 crc kubenswrapper[4628]: I1211 06:21:49.405090 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwxm9" event={"ID":"6d378db0-a631-4f38-a46a-8d5e1ddeda86","Type":"ContainerDied","Data":"16e16142ba8f06f36d81b23bda8537dbcbecfab51bec92e952b44e5b72526daf"} Dec 11 06:21:50 crc kubenswrapper[4628]: I1211 06:21:50.417236 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwxm9" event={"ID":"6d378db0-a631-4f38-a46a-8d5e1ddeda86","Type":"ContainerStarted","Data":"2e017a28fce8bff45d37726ad4e0258a2d8817b81eb1b38eca73995baef04047"} Dec 11 06:21:50 crc kubenswrapper[4628]: I1211 06:21:50.441015 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rwxm9" podStartSLOduration=2.795791169 podStartE2EDuration="5.440995941s" podCreationTimestamp="2025-12-11 06:21:45 +0000 UTC" firstStartedPulling="2025-12-11 06:21:47.380301095 +0000 UTC m=+4009.797647793" lastFinishedPulling="2025-12-11 06:21:50.025505867 +0000 UTC m=+4012.442852565" observedRunningTime="2025-12-11 06:21:50.434419204 +0000 UTC m=+4012.851765902" watchObservedRunningTime="2025-12-11 06:21:50.440995941 +0000 UTC m=+4012.858342639" Dec 11 06:21:56 crc kubenswrapper[4628]: I1211 06:21:56.014134 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rwxm9" Dec 11 06:21:56 crc kubenswrapper[4628]: I1211 06:21:56.014740 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rwxm9" Dec 11 06:21:56 crc kubenswrapper[4628]: I1211 06:21:56.074089 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rwxm9" Dec 11 06:21:56 crc kubenswrapper[4628]: I1211 06:21:56.513194 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rwxm9" Dec 11 06:21:56 crc kubenswrapper[4628]: I1211 06:21:56.569605 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rwxm9"] Dec 11 06:21:58 crc kubenswrapper[4628]: I1211 06:21:58.479802 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rwxm9" podUID="6d378db0-a631-4f38-a46a-8d5e1ddeda86" containerName="registry-server" containerID="cri-o://2e017a28fce8bff45d37726ad4e0258a2d8817b81eb1b38eca73995baef04047" gracePeriod=2 Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.177061 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rwxm9" Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.232143 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d378db0-a631-4f38-a46a-8d5e1ddeda86-catalog-content\") pod \"6d378db0-a631-4f38-a46a-8d5e1ddeda86\" (UID: \"6d378db0-a631-4f38-a46a-8d5e1ddeda86\") " Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.232348 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d378db0-a631-4f38-a46a-8d5e1ddeda86-utilities\") pod \"6d378db0-a631-4f38-a46a-8d5e1ddeda86\" (UID: \"6d378db0-a631-4f38-a46a-8d5e1ddeda86\") " Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.232454 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kqwx\" (UniqueName: \"kubernetes.io/projected/6d378db0-a631-4f38-a46a-8d5e1ddeda86-kube-api-access-2kqwx\") pod \"6d378db0-a631-4f38-a46a-8d5e1ddeda86\" (UID: \"6d378db0-a631-4f38-a46a-8d5e1ddeda86\") " Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.233714 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d378db0-a631-4f38-a46a-8d5e1ddeda86-utilities" (OuterVolumeSpecName: "utilities") pod "6d378db0-a631-4f38-a46a-8d5e1ddeda86" (UID: "6d378db0-a631-4f38-a46a-8d5e1ddeda86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.239720 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d378db0-a631-4f38-a46a-8d5e1ddeda86-kube-api-access-2kqwx" (OuterVolumeSpecName: "kube-api-access-2kqwx") pod "6d378db0-a631-4f38-a46a-8d5e1ddeda86" (UID: "6d378db0-a631-4f38-a46a-8d5e1ddeda86"). InnerVolumeSpecName "kube-api-access-2kqwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.295324 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d378db0-a631-4f38-a46a-8d5e1ddeda86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d378db0-a631-4f38-a46a-8d5e1ddeda86" (UID: "6d378db0-a631-4f38-a46a-8d5e1ddeda86"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.334967 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d378db0-a631-4f38-a46a-8d5e1ddeda86-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.335015 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kqwx\" (UniqueName: \"kubernetes.io/projected/6d378db0-a631-4f38-a46a-8d5e1ddeda86-kube-api-access-2kqwx\") on node \"crc\" DevicePath \"\"" Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.335030 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d378db0-a631-4f38-a46a-8d5e1ddeda86-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.491644 4628 generic.go:334] "Generic (PLEG): container finished" podID="6d378db0-a631-4f38-a46a-8d5e1ddeda86" containerID="2e017a28fce8bff45d37726ad4e0258a2d8817b81eb1b38eca73995baef04047" exitCode=0 Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.491688 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwxm9" event={"ID":"6d378db0-a631-4f38-a46a-8d5e1ddeda86","Type":"ContainerDied","Data":"2e017a28fce8bff45d37726ad4e0258a2d8817b81eb1b38eca73995baef04047"} Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.491724 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwxm9" Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.491740 4628 scope.go:117] "RemoveContainer" containerID="2e017a28fce8bff45d37726ad4e0258a2d8817b81eb1b38eca73995baef04047" Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.491727 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwxm9" event={"ID":"6d378db0-a631-4f38-a46a-8d5e1ddeda86","Type":"ContainerDied","Data":"e4bb773e243d18f41322976dae25d48f090e7f62c0d732577cf45240520c515c"} Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.520455 4628 scope.go:117] "RemoveContainer" containerID="16e16142ba8f06f36d81b23bda8537dbcbecfab51bec92e952b44e5b72526daf" Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.533794 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rwxm9"] Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.544049 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rwxm9"] Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.915830 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d378db0-a631-4f38-a46a-8d5e1ddeda86" path="/var/lib/kubelet/pods/6d378db0-a631-4f38-a46a-8d5e1ddeda86/volumes" Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.916695 4628 scope.go:117] "RemoveContainer" containerID="013faddf0dd6fae1bd17a921c677b41b27dd82e7a8266f294b8940c24c543e77" Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.976471 4628 scope.go:117] "RemoveContainer" containerID="2e017a28fce8bff45d37726ad4e0258a2d8817b81eb1b38eca73995baef04047" Dec 11 06:21:59 crc kubenswrapper[4628]: E1211 06:21:59.977157 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e017a28fce8bff45d37726ad4e0258a2d8817b81eb1b38eca73995baef04047\": container with ID 
starting with 2e017a28fce8bff45d37726ad4e0258a2d8817b81eb1b38eca73995baef04047 not found: ID does not exist" containerID="2e017a28fce8bff45d37726ad4e0258a2d8817b81eb1b38eca73995baef04047" Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.977197 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e017a28fce8bff45d37726ad4e0258a2d8817b81eb1b38eca73995baef04047"} err="failed to get container status \"2e017a28fce8bff45d37726ad4e0258a2d8817b81eb1b38eca73995baef04047\": rpc error: code = NotFound desc = could not find container \"2e017a28fce8bff45d37726ad4e0258a2d8817b81eb1b38eca73995baef04047\": container with ID starting with 2e017a28fce8bff45d37726ad4e0258a2d8817b81eb1b38eca73995baef04047 not found: ID does not exist" Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.977226 4628 scope.go:117] "RemoveContainer" containerID="16e16142ba8f06f36d81b23bda8537dbcbecfab51bec92e952b44e5b72526daf" Dec 11 06:21:59 crc kubenswrapper[4628]: E1211 06:21:59.977423 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16e16142ba8f06f36d81b23bda8537dbcbecfab51bec92e952b44e5b72526daf\": container with ID starting with 16e16142ba8f06f36d81b23bda8537dbcbecfab51bec92e952b44e5b72526daf not found: ID does not exist" containerID="16e16142ba8f06f36d81b23bda8537dbcbecfab51bec92e952b44e5b72526daf" Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.977453 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16e16142ba8f06f36d81b23bda8537dbcbecfab51bec92e952b44e5b72526daf"} err="failed to get container status \"16e16142ba8f06f36d81b23bda8537dbcbecfab51bec92e952b44e5b72526daf\": rpc error: code = NotFound desc = could not find container \"16e16142ba8f06f36d81b23bda8537dbcbecfab51bec92e952b44e5b72526daf\": container with ID starting with 16e16142ba8f06f36d81b23bda8537dbcbecfab51bec92e952b44e5b72526daf not found: ID does not exist" Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.977472 4628 scope.go:117] "RemoveContainer" containerID="013faddf0dd6fae1bd17a921c677b41b27dd82e7a8266f294b8940c24c543e77" Dec 11 06:21:59 crc kubenswrapper[4628]: E1211 06:21:59.977650 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"013faddf0dd6fae1bd17a921c677b41b27dd82e7a8266f294b8940c24c543e77\": container with ID starting with 013faddf0dd6fae1bd17a921c677b41b27dd82e7a8266f294b8940c24c543e77 not found: ID does not exist" containerID="013faddf0dd6fae1bd17a921c677b41b27dd82e7a8266f294b8940c24c543e77" Dec 11 06:21:59 crc kubenswrapper[4628]: I1211 06:21:59.977675 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013faddf0dd6fae1bd17a921c677b41b27dd82e7a8266f294b8940c24c543e77"} err="failed to get container status \"013faddf0dd6fae1bd17a921c677b41b27dd82e7a8266f294b8940c24c543e77\": rpc error: code = NotFound desc = could not find container \"013faddf0dd6fae1bd17a921c677b41b27dd82e7a8266f294b8940c24c543e77\": container with ID starting with 013faddf0dd6fae1bd17a921c677b41b27dd82e7a8266f294b8940c24c543e77 not found: ID does not exist" Dec 11 06:22:31 crc kubenswrapper[4628]: I1211 06:22:31.427470 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:22:31 crc kubenswrapper[4628]: I1211 06:22:31.428172 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:23:01 crc kubenswrapper[4628]: I1211 06:23:01.426524 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:23:01 crc kubenswrapper[4628]: I1211 06:23:01.427004 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:23:31 crc kubenswrapper[4628]: I1211 06:23:31.427314 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:23:31 crc kubenswrapper[4628]: I1211 06:23:31.427888 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:23:31 crc kubenswrapper[4628]: I1211 06:23:31.427932 4628 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 06:23:31 crc kubenswrapper[4628]: I1211 06:23:31.428494 4628 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f7b78a4d45fc2941babafd07cee8c3222be00b496ea76158561b39fc28da3cd"} pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 06:23:31 crc kubenswrapper[4628]: I1211 06:23:31.428543 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" containerID="cri-o://3f7b78a4d45fc2941babafd07cee8c3222be00b496ea76158561b39fc28da3cd" gracePeriod=600 Dec 11 06:23:32 crc kubenswrapper[4628]: I1211 06:23:32.346302 4628 generic.go:334] "Generic (PLEG): container finished" podID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerID="3f7b78a4d45fc2941babafd07cee8c3222be00b496ea76158561b39fc28da3cd" exitCode=0 Dec 11 06:23:32 crc kubenswrapper[4628]: I1211 06:23:32.346435 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" 
event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerDied","Data":"3f7b78a4d45fc2941babafd07cee8c3222be00b496ea76158561b39fc28da3cd"} Dec 11 06:23:32 crc kubenswrapper[4628]: I1211 06:23:32.346821 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa"} Dec 11 06:23:32 crc kubenswrapper[4628]: I1211 06:23:32.346839 4628 scope.go:117] "RemoveContainer" containerID="fee9d45dce7943d023ed284a14d952c57c1b2e8d9adce097defccf1cc9b86521" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.480906 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k76xr"] Dec 11 06:24:37 crc kubenswrapper[4628]: E1211 06:24:37.482089 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d378db0-a631-4f38-a46a-8d5e1ddeda86" containerName="extract-content" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.482105 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d378db0-a631-4f38-a46a-8d5e1ddeda86" containerName="extract-content" Dec 11 06:24:37 crc kubenswrapper[4628]: E1211 06:24:37.482124 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d378db0-a631-4f38-a46a-8d5e1ddeda86" containerName="registry-server" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.482130 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d378db0-a631-4f38-a46a-8d5e1ddeda86" containerName="registry-server" Dec 11 06:24:37 crc kubenswrapper[4628]: E1211 06:24:37.482150 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df44463c-e622-4f99-a203-bbc33d7af21e" containerName="extract-content" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.482158 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="df44463c-e622-4f99-a203-bbc33d7af21e" containerName="extract-content" Dec 11 06:24:37 crc kubenswrapper[4628]: E1211 06:24:37.482171 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df44463c-e622-4f99-a203-bbc33d7af21e" containerName="extract-utilities" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.482177 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="df44463c-e622-4f99-a203-bbc33d7af21e" containerName="extract-utilities" Dec 11 06:24:37 crc kubenswrapper[4628]: E1211 06:24:37.482188 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df44463c-e622-4f99-a203-bbc33d7af21e" containerName="registry-server" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.482194 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="df44463c-e622-4f99-a203-bbc33d7af21e" containerName="registry-server" Dec 11 06:24:37 crc kubenswrapper[4628]: E1211 06:24:37.482216 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d378db0-a631-4f38-a46a-8d5e1ddeda86" containerName="extract-utilities" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.482222 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d378db0-a631-4f38-a46a-8d5e1ddeda86" containerName="extract-utilities" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.482390 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d378db0-a631-4f38-a46a-8d5e1ddeda86" containerName="registry-server" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.482416 4628 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="df44463c-e622-4f99-a203-bbc33d7af21e" containerName="registry-server" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.484008 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k76xr" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.498463 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k76xr"] Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.550929 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16feb168-2126-4fd0-8fe1-557348548ced-utilities\") pod \"redhat-operators-k76xr\" (UID: \"16feb168-2126-4fd0-8fe1-557348548ced\") " pod="openshift-marketplace/redhat-operators-k76xr" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.551180 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16feb168-2126-4fd0-8fe1-557348548ced-catalog-content\") pod \"redhat-operators-k76xr\" (UID: \"16feb168-2126-4fd0-8fe1-557348548ced\") " pod="openshift-marketplace/redhat-operators-k76xr" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.551276 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8jgs\" (UniqueName: \"kubernetes.io/projected/16feb168-2126-4fd0-8fe1-557348548ced-kube-api-access-v8jgs\") pod \"redhat-operators-k76xr\" (UID: \"16feb168-2126-4fd0-8fe1-557348548ced\") " pod="openshift-marketplace/redhat-operators-k76xr" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.652516 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16feb168-2126-4fd0-8fe1-557348548ced-utilities\") pod \"redhat-operators-k76xr\" (UID: \"16feb168-2126-4fd0-8fe1-557348548ced\") " pod="openshift-marketplace/redhat-operators-k76xr" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.652886 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16feb168-2126-4fd0-8fe1-557348548ced-catalog-content\") pod \"redhat-operators-k76xr\" (UID: \"16feb168-2126-4fd0-8fe1-557348548ced\") " pod="openshift-marketplace/redhat-operators-k76xr" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.653034 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8jgs\" (UniqueName: \"kubernetes.io/projected/16feb168-2126-4fd0-8fe1-557348548ced-kube-api-access-v8jgs\") pod \"redhat-operators-k76xr\" (UID: \"16feb168-2126-4fd0-8fe1-557348548ced\") " pod="openshift-marketplace/redhat-operators-k76xr" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.653453 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16feb168-2126-4fd0-8fe1-557348548ced-utilities\") pod \"redhat-operators-k76xr\" (UID: \"16feb168-2126-4fd0-8fe1-557348548ced\") " pod="openshift-marketplace/redhat-operators-k76xr" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.653758 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16feb168-2126-4fd0-8fe1-557348548ced-catalog-content\") pod \"redhat-operators-k76xr\" (UID: \"16feb168-2126-4fd0-8fe1-557348548ced\") " 
pod="openshift-marketplace/redhat-operators-k76xr" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.676431 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8jgs\" (UniqueName: \"kubernetes.io/projected/16feb168-2126-4fd0-8fe1-557348548ced-kube-api-access-v8jgs\") pod \"redhat-operators-k76xr\" (UID: \"16feb168-2126-4fd0-8fe1-557348548ced\") " pod="openshift-marketplace/redhat-operators-k76xr" Dec 11 06:24:37 crc kubenswrapper[4628]: I1211 06:24:37.819735 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k76xr" Dec 11 06:24:38 crc kubenswrapper[4628]: I1211 06:24:38.370317 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k76xr"] Dec 11 06:24:38 crc kubenswrapper[4628]: I1211 06:24:38.409062 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k76xr" event={"ID":"16feb168-2126-4fd0-8fe1-557348548ced","Type":"ContainerStarted","Data":"ade092fb3a792be7e88b4adff7ef5589cefb2340740ec7e402348b5bfead85e3"} Dec 11 06:24:39 crc kubenswrapper[4628]: I1211 06:24:39.417561 4628 generic.go:334] "Generic (PLEG): container finished" podID="16feb168-2126-4fd0-8fe1-557348548ced" containerID="5b496fb92a48efe5c1b8811b4b0f983eb455a7f63cc9d37ed3c6f5991983bfb3" exitCode=0 Dec 11 06:24:39 crc kubenswrapper[4628]: I1211 06:24:39.417615 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k76xr" event={"ID":"16feb168-2126-4fd0-8fe1-557348548ced","Type":"ContainerDied","Data":"5b496fb92a48efe5c1b8811b4b0f983eb455a7f63cc9d37ed3c6f5991983bfb3"} Dec 11 06:24:41 crc kubenswrapper[4628]: I1211 06:24:41.455162 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k76xr" event={"ID":"16feb168-2126-4fd0-8fe1-557348548ced","Type":"ContainerStarted","Data":"0d0c6683854e8f68a4111e8169186af64d667b99fc553320745e8984ab83de72"} Dec 11 06:24:44 crc kubenswrapper[4628]: I1211 06:24:44.479166 4628 generic.go:334] "Generic (PLEG): container finished" podID="16feb168-2126-4fd0-8fe1-557348548ced" containerID="0d0c6683854e8f68a4111e8169186af64d667b99fc553320745e8984ab83de72" exitCode=0 Dec 11 06:24:44 crc kubenswrapper[4628]: I1211 06:24:44.479250 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k76xr" event={"ID":"16feb168-2126-4fd0-8fe1-557348548ced","Type":"ContainerDied","Data":"0d0c6683854e8f68a4111e8169186af64d667b99fc553320745e8984ab83de72"} Dec 11 06:24:45 crc kubenswrapper[4628]: I1211 06:24:45.489208 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k76xr" event={"ID":"16feb168-2126-4fd0-8fe1-557348548ced","Type":"ContainerStarted","Data":"50c41924b52efb992ce00b907ed05725ffff4b3995ea774921f96616eab090c7"} Dec 11 06:24:45 crc kubenswrapper[4628]: I1211 06:24:45.511725 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k76xr" podStartSLOduration=2.819349087 podStartE2EDuration="8.511703529s" podCreationTimestamp="2025-12-11 06:24:37 +0000 UTC" firstStartedPulling="2025-12-11 06:24:39.419975538 +0000 UTC m=+4181.837322236" lastFinishedPulling="2025-12-11 06:24:45.11232998 +0000 UTC m=+4187.529676678" observedRunningTime="2025-12-11 06:24:45.509718105 +0000 UTC m=+4187.927064803" watchObservedRunningTime="2025-12-11 06:24:45.511703529 +0000 UTC m=+4187.929050227" Dec 11 
06:24:47 crc kubenswrapper[4628]: I1211 06:24:47.819965 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k76xr" Dec 11 06:24:47 crc kubenswrapper[4628]: I1211 06:24:47.820010 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k76xr" Dec 11 06:24:48 crc kubenswrapper[4628]: I1211 06:24:48.883774 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k76xr" podUID="16feb168-2126-4fd0-8fe1-557348548ced" containerName="registry-server" probeResult="failure" output=< Dec 11 06:24:48 crc kubenswrapper[4628]: timeout: failed to connect service ":50051" within 1s Dec 11 06:24:48 crc kubenswrapper[4628]: > Dec 11 06:24:57 crc kubenswrapper[4628]: I1211 06:24:57.923045 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k76xr" Dec 11 06:24:58 crc kubenswrapper[4628]: I1211 06:24:58.040678 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k76xr" Dec 11 06:24:58 crc kubenswrapper[4628]: I1211 06:24:58.176765 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k76xr"] Dec 11 06:24:59 crc kubenswrapper[4628]: I1211 06:24:59.624434 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k76xr" podUID="16feb168-2126-4fd0-8fe1-557348548ced" containerName="registry-server" containerID="cri-o://50c41924b52efb992ce00b907ed05725ffff4b3995ea774921f96616eab090c7" gracePeriod=2 Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.150078 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k76xr" Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.214940 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16feb168-2126-4fd0-8fe1-557348548ced-utilities\") pod \"16feb168-2126-4fd0-8fe1-557348548ced\" (UID: \"16feb168-2126-4fd0-8fe1-557348548ced\") " Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.215135 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8jgs\" (UniqueName: \"kubernetes.io/projected/16feb168-2126-4fd0-8fe1-557348548ced-kube-api-access-v8jgs\") pod \"16feb168-2126-4fd0-8fe1-557348548ced\" (UID: \"16feb168-2126-4fd0-8fe1-557348548ced\") " Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.215165 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16feb168-2126-4fd0-8fe1-557348548ced-catalog-content\") pod \"16feb168-2126-4fd0-8fe1-557348548ced\" (UID: \"16feb168-2126-4fd0-8fe1-557348548ced\") " Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.216673 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16feb168-2126-4fd0-8fe1-557348548ced-utilities" (OuterVolumeSpecName: "utilities") pod "16feb168-2126-4fd0-8fe1-557348548ced" (UID: "16feb168-2126-4fd0-8fe1-557348548ced"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.243082 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16feb168-2126-4fd0-8fe1-557348548ced-kube-api-access-v8jgs" (OuterVolumeSpecName: "kube-api-access-v8jgs") pod "16feb168-2126-4fd0-8fe1-557348548ced" (UID: "16feb168-2126-4fd0-8fe1-557348548ced"). InnerVolumeSpecName "kube-api-access-v8jgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.317641 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16feb168-2126-4fd0-8fe1-557348548ced-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.317669 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8jgs\" (UniqueName: \"kubernetes.io/projected/16feb168-2126-4fd0-8fe1-557348548ced-kube-api-access-v8jgs\") on node \"crc\" DevicePath \"\"" Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.334814 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16feb168-2126-4fd0-8fe1-557348548ced-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16feb168-2126-4fd0-8fe1-557348548ced" (UID: "16feb168-2126-4fd0-8fe1-557348548ced"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.419397 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16feb168-2126-4fd0-8fe1-557348548ced-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.633719 4628 generic.go:334] "Generic (PLEG): container finished" podID="16feb168-2126-4fd0-8fe1-557348548ced" containerID="50c41924b52efb992ce00b907ed05725ffff4b3995ea774921f96616eab090c7" exitCode=0 Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.633774 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k76xr" event={"ID":"16feb168-2126-4fd0-8fe1-557348548ced","Type":"ContainerDied","Data":"50c41924b52efb992ce00b907ed05725ffff4b3995ea774921f96616eab090c7"} Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.633814 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k76xr" event={"ID":"16feb168-2126-4fd0-8fe1-557348548ced","Type":"ContainerDied","Data":"ade092fb3a792be7e88b4adff7ef5589cefb2340740ec7e402348b5bfead85e3"} Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.633835 4628 scope.go:117] "RemoveContainer" containerID="50c41924b52efb992ce00b907ed05725ffff4b3995ea774921f96616eab090c7" Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.635116 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k76xr" Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.656609 4628 scope.go:117] "RemoveContainer" containerID="0d0c6683854e8f68a4111e8169186af64d667b99fc553320745e8984ab83de72" Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.678474 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k76xr"] Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.693230 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k76xr"] Dec 11 06:25:00 crc kubenswrapper[4628]: I1211 06:25:00.695788 4628 scope.go:117] "RemoveContainer" containerID="5b496fb92a48efe5c1b8811b4b0f983eb455a7f63cc9d37ed3c6f5991983bfb3" Dec 11 06:25:01 crc kubenswrapper[4628]: I1211 06:25:01.359519 4628 scope.go:117] "RemoveContainer" containerID="50c41924b52efb992ce00b907ed05725ffff4b3995ea774921f96616eab090c7" Dec 11 06:25:01 crc kubenswrapper[4628]: E1211 06:25:01.360445 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50c41924b52efb992ce00b907ed05725ffff4b3995ea774921f96616eab090c7\": container with ID starting with 50c41924b52efb992ce00b907ed05725ffff4b3995ea774921f96616eab090c7 not found: ID does not exist" containerID="50c41924b52efb992ce00b907ed05725ffff4b3995ea774921f96616eab090c7" Dec 11 06:25:01 crc kubenswrapper[4628]: I1211 06:25:01.360509 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50c41924b52efb992ce00b907ed05725ffff4b3995ea774921f96616eab090c7"} err="failed to get container status \"50c41924b52efb992ce00b907ed05725ffff4b3995ea774921f96616eab090c7\": rpc error: code = NotFound desc = could not find container \"50c41924b52efb992ce00b907ed05725ffff4b3995ea774921f96616eab090c7\": container with ID starting with 50c41924b52efb992ce00b907ed05725ffff4b3995ea774921f96616eab090c7 not found: ID does not exist" Dec 11 06:25:01 crc kubenswrapper[4628]: I1211 06:25:01.360546 4628 scope.go:117] "RemoveContainer" containerID="0d0c6683854e8f68a4111e8169186af64d667b99fc553320745e8984ab83de72" Dec 11 06:25:01 crc kubenswrapper[4628]: E1211 06:25:01.360975 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d0c6683854e8f68a4111e8169186af64d667b99fc553320745e8984ab83de72\": container with ID starting with 0d0c6683854e8f68a4111e8169186af64d667b99fc553320745e8984ab83de72 not found: ID does not exist" containerID="0d0c6683854e8f68a4111e8169186af64d667b99fc553320745e8984ab83de72" Dec 11 06:25:01 crc kubenswrapper[4628]: I1211 06:25:01.361017 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d0c6683854e8f68a4111e8169186af64d667b99fc553320745e8984ab83de72"} err="failed to get container status \"0d0c6683854e8f68a4111e8169186af64d667b99fc553320745e8984ab83de72\": rpc error: code = NotFound desc = could not find container \"0d0c6683854e8f68a4111e8169186af64d667b99fc553320745e8984ab83de72\": container with ID starting with 0d0c6683854e8f68a4111e8169186af64d667b99fc553320745e8984ab83de72 not found: ID does not exist" Dec 11 06:25:01 crc kubenswrapper[4628]: I1211 06:25:01.361041 4628 scope.go:117] "RemoveContainer" containerID="5b496fb92a48efe5c1b8811b4b0f983eb455a7f63cc9d37ed3c6f5991983bfb3" Dec 11 06:25:01 crc kubenswrapper[4628]: E1211 06:25:01.361352 4628 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"5b496fb92a48efe5c1b8811b4b0f983eb455a7f63cc9d37ed3c6f5991983bfb3\": container with ID starting with 5b496fb92a48efe5c1b8811b4b0f983eb455a7f63cc9d37ed3c6f5991983bfb3 not found: ID does not exist" containerID="5b496fb92a48efe5c1b8811b4b0f983eb455a7f63cc9d37ed3c6f5991983bfb3" Dec 11 06:25:01 crc kubenswrapper[4628]: I1211 06:25:01.361386 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b496fb92a48efe5c1b8811b4b0f983eb455a7f63cc9d37ed3c6f5991983bfb3"} err="failed to get container status \"5b496fb92a48efe5c1b8811b4b0f983eb455a7f63cc9d37ed3c6f5991983bfb3\": rpc error: code = NotFound desc = could not find container \"5b496fb92a48efe5c1b8811b4b0f983eb455a7f63cc9d37ed3c6f5991983bfb3\": container with ID starting with 5b496fb92a48efe5c1b8811b4b0f983eb455a7f63cc9d37ed3c6f5991983bfb3 not found: ID does not exist" Dec 11 06:25:01 crc kubenswrapper[4628]: I1211 06:25:01.900805 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16feb168-2126-4fd0-8fe1-557348548ced" path="/var/lib/kubelet/pods/16feb168-2126-4fd0-8fe1-557348548ced/volumes" Dec 11 06:25:09 crc kubenswrapper[4628]: I1211 06:25:09.715344 4628 generic.go:334] "Generic (PLEG): container finished" podID="bdf75bdb-5535-4134-b9aa-f094e9e220fc" containerID="37d9c640ee63f39146a5c38040546dd6c6400b496a038d904e4c5b5ef0d04467" exitCode=0 Dec 11 06:25:09 crc kubenswrapper[4628]: I1211 06:25:09.715432 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"bdf75bdb-5535-4134-b9aa-f094e9e220fc","Type":"ContainerDied","Data":"37d9c640ee63f39146a5c38040546dd6c6400b496a038d904e4c5b5ef0d04467"} Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.709769 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.733946 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"bdf75bdb-5535-4134-b9aa-f094e9e220fc","Type":"ContainerDied","Data":"e6686896659ba93dfb262c5e6243e2cd19f94b82898e69da4bb44fd84472f67d"} Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.733992 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6686896659ba93dfb262c5e6243e2cd19f94b82898e69da4bb44fd84472f67d" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.733998 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.832286 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bdf75bdb-5535-4134-b9aa-f094e9e220fc-openstack-config\") pod \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.832319 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.832387 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bdf75bdb-5535-4134-b9aa-f094e9e220fc-ca-certs\") pod \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.832417 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bdf75bdb-5535-4134-b9aa-f094e9e220fc-test-operator-ephemeral-temporary\") pod \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.832453 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bdf75bdb-5535-4134-b9aa-f094e9e220fc-openstack-config-secret\") pod \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.832484 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85r47\" (UniqueName: \"kubernetes.io/projected/bdf75bdb-5535-4134-b9aa-f094e9e220fc-kube-api-access-85r47\") pod \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.832503 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bdf75bdb-5535-4134-b9aa-f094e9e220fc-ssh-key\") pod \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.832527 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bdf75bdb-5535-4134-b9aa-f094e9e220fc-test-operator-ephemeral-workdir\") pod \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.832666 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdf75bdb-5535-4134-b9aa-f094e9e220fc-config-data\") pod \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\" (UID: \"bdf75bdb-5535-4134-b9aa-f094e9e220fc\") " Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.833464 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdf75bdb-5535-4134-b9aa-f094e9e220fc-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "bdf75bdb-5535-4134-b9aa-f094e9e220fc" (UID: "bdf75bdb-5535-4134-b9aa-f094e9e220fc"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.833951 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdf75bdb-5535-4134-b9aa-f094e9e220fc-config-data" (OuterVolumeSpecName: "config-data") pod "bdf75bdb-5535-4134-b9aa-f094e9e220fc" (UID: "bdf75bdb-5535-4134-b9aa-f094e9e220fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.838938 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdf75bdb-5535-4134-b9aa-f094e9e220fc-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "bdf75bdb-5535-4134-b9aa-f094e9e220fc" (UID: "bdf75bdb-5535-4134-b9aa-f094e9e220fc"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.894243 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdf75bdb-5535-4134-b9aa-f094e9e220fc-kube-api-access-85r47" (OuterVolumeSpecName: "kube-api-access-85r47") pod "bdf75bdb-5535-4134-b9aa-f094e9e220fc" (UID: "bdf75bdb-5535-4134-b9aa-f094e9e220fc"). InnerVolumeSpecName "kube-api-access-85r47". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.895752 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "bdf75bdb-5535-4134-b9aa-f094e9e220fc" (UID: "bdf75bdb-5535-4134-b9aa-f094e9e220fc"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.900628 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf75bdb-5535-4134-b9aa-f094e9e220fc-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "bdf75bdb-5535-4134-b9aa-f094e9e220fc" (UID: "bdf75bdb-5535-4134-b9aa-f094e9e220fc"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.906301 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf75bdb-5535-4134-b9aa-f094e9e220fc-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "bdf75bdb-5535-4134-b9aa-f094e9e220fc" (UID: "bdf75bdb-5535-4134-b9aa-f094e9e220fc"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.913473 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf75bdb-5535-4134-b9aa-f094e9e220fc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bdf75bdb-5535-4134-b9aa-f094e9e220fc" (UID: "bdf75bdb-5535-4134-b9aa-f094e9e220fc"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.928402 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdf75bdb-5535-4134-b9aa-f094e9e220fc-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "bdf75bdb-5535-4134-b9aa-f094e9e220fc" (UID: "bdf75bdb-5535-4134-b9aa-f094e9e220fc"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.934887 4628 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdf75bdb-5535-4134-b9aa-f094e9e220fc-config-data\") on node \"crc\" DevicePath \"\"" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.935211 4628 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bdf75bdb-5535-4134-b9aa-f094e9e220fc-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.935249 4628 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.935281 4628 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bdf75bdb-5535-4134-b9aa-f094e9e220fc-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.935291 4628 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bdf75bdb-5535-4134-b9aa-f094e9e220fc-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.935300 4628 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bdf75bdb-5535-4134-b9aa-f094e9e220fc-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.935309 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85r47\" (UniqueName: \"kubernetes.io/projected/bdf75bdb-5535-4134-b9aa-f094e9e220fc-kube-api-access-85r47\") on node \"crc\" DevicePath \"\"" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.935317 4628 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bdf75bdb-5535-4134-b9aa-f094e9e220fc-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.935325 4628 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bdf75bdb-5535-4134-b9aa-f094e9e220fc-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 11 06:25:11 crc kubenswrapper[4628]: I1211 06:25:11.955280 4628 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 11 06:25:12 crc kubenswrapper[4628]: I1211 06:25:12.037777 4628 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 11 06:25:20 crc kubenswrapper[4628]: I1211 06:25:20.860820 4628 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 11 06:25:20 crc kubenswrapper[4628]: E1211 06:25:20.862728 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16feb168-2126-4fd0-8fe1-557348548ced" containerName="extract-content" Dec 11 06:25:20 crc kubenswrapper[4628]: I1211 06:25:20.862813 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="16feb168-2126-4fd0-8fe1-557348548ced" containerName="extract-content" Dec 11 06:25:20 crc kubenswrapper[4628]: E1211 06:25:20.862924 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdf75bdb-5535-4134-b9aa-f094e9e220fc" containerName="tempest-tests-tempest-tests-runner" Dec 11 06:25:20 crc kubenswrapper[4628]: I1211 06:25:20.863003 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf75bdb-5535-4134-b9aa-f094e9e220fc" containerName="tempest-tests-tempest-tests-runner" Dec 11 06:25:20 crc kubenswrapper[4628]: E1211 06:25:20.863082 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16feb168-2126-4fd0-8fe1-557348548ced" containerName="extract-utilities" Dec 11 06:25:20 crc kubenswrapper[4628]: I1211 06:25:20.863136 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="16feb168-2126-4fd0-8fe1-557348548ced" containerName="extract-utilities" Dec 11 06:25:20 crc kubenswrapper[4628]: E1211 06:25:20.863199 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16feb168-2126-4fd0-8fe1-557348548ced" containerName="registry-server" Dec 11 06:25:20 crc kubenswrapper[4628]: I1211 06:25:20.863250 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="16feb168-2126-4fd0-8fe1-557348548ced" containerName="registry-server" Dec 11 06:25:20 crc kubenswrapper[4628]: I1211 06:25:20.863485 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="16feb168-2126-4fd0-8fe1-557348548ced" containerName="registry-server" Dec 11 06:25:20 crc kubenswrapper[4628]: I1211 06:25:20.863560 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdf75bdb-5535-4134-b9aa-f094e9e220fc" containerName="tempest-tests-tempest-tests-runner" Dec 11 06:25:20 crc kubenswrapper[4628]: I1211 06:25:20.864874 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 11 06:25:20 crc kubenswrapper[4628]: I1211 06:25:20.868258 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-qq5gj" Dec 11 06:25:20 crc kubenswrapper[4628]: I1211 06:25:20.876037 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 11 06:25:21 crc kubenswrapper[4628]: I1211 06:25:21.018813 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4072e36d-49c0-40a9-93d2-5700ef264f8b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 11 06:25:21 crc kubenswrapper[4628]: I1211 06:25:21.019126 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz4q6\" (UniqueName: \"kubernetes.io/projected/4072e36d-49c0-40a9-93d2-5700ef264f8b-kube-api-access-bz4q6\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4072e36d-49c0-40a9-93d2-5700ef264f8b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 11 06:25:21 crc kubenswrapper[4628]: I1211 06:25:21.120362 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz4q6\" (UniqueName: \"kubernetes.io/projected/4072e36d-49c0-40a9-93d2-5700ef264f8b-kube-api-access-bz4q6\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4072e36d-49c0-40a9-93d2-5700ef264f8b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 11 06:25:21 crc kubenswrapper[4628]: I1211 06:25:21.120422 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4072e36d-49c0-40a9-93d2-5700ef264f8b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 11 06:25:21 crc kubenswrapper[4628]: I1211 06:25:21.121412 4628 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4072e36d-49c0-40a9-93d2-5700ef264f8b\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 11 06:25:21 crc kubenswrapper[4628]: I1211 06:25:21.147909 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz4q6\" (UniqueName: \"kubernetes.io/projected/4072e36d-49c0-40a9-93d2-5700ef264f8b-kube-api-access-bz4q6\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4072e36d-49c0-40a9-93d2-5700ef264f8b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 11 06:25:21 crc kubenswrapper[4628]: I1211 06:25:21.161530 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"4072e36d-49c0-40a9-93d2-5700ef264f8b\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 11 06:25:21 crc 
kubenswrapper[4628]: I1211 06:25:21.188759 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 11 06:25:21 crc kubenswrapper[4628]: I1211 06:25:21.663251 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 11 06:25:21 crc kubenswrapper[4628]: I1211 06:25:21.821636 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"4072e36d-49c0-40a9-93d2-5700ef264f8b","Type":"ContainerStarted","Data":"95ae1d5f380085c4f74f00853ea95ddf4001dc0028976887a5e8cc2016952841"} Dec 11 06:25:23 crc kubenswrapper[4628]: I1211 06:25:23.841421 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"4072e36d-49c0-40a9-93d2-5700ef264f8b","Type":"ContainerStarted","Data":"7d4cc789f7cea9b34fab7a8e90ac48a56ee7bdca93f63257077b23763a7e84d0"} Dec 11 06:25:23 crc kubenswrapper[4628]: I1211 06:25:23.862425 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.818262787 podStartE2EDuration="3.862404597s" podCreationTimestamp="2025-12-11 06:25:20 +0000 UTC" firstStartedPulling="2025-12-11 06:25:21.668833344 +0000 UTC m=+4224.086180062" lastFinishedPulling="2025-12-11 06:25:22.712975174 +0000 UTC m=+4225.130321872" observedRunningTime="2025-12-11 06:25:23.857316429 +0000 UTC m=+4226.274663197" watchObservedRunningTime="2025-12-11 06:25:23.862404597 +0000 UTC m=+4226.279751295" Dec 11 06:25:31 crc kubenswrapper[4628]: I1211 06:25:31.426491 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:25:31 crc kubenswrapper[4628]: I1211 06:25:31.427399 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:25:47 crc kubenswrapper[4628]: I1211 06:25:47.419001 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x7dhv/must-gather-k96rn"] Dec 11 06:25:47 crc kubenswrapper[4628]: I1211 06:25:47.421811 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x7dhv/must-gather-k96rn" Dec 11 06:25:47 crc kubenswrapper[4628]: I1211 06:25:47.435331 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-x7dhv"/"openshift-service-ca.crt" Dec 11 06:25:47 crc kubenswrapper[4628]: I1211 06:25:47.435836 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-x7dhv"/"kube-root-ca.crt" Dec 11 06:25:47 crc kubenswrapper[4628]: I1211 06:25:47.436308 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-x7dhv"/"default-dockercfg-m26m9" Dec 11 06:25:47 crc kubenswrapper[4628]: I1211 06:25:47.442273 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x7dhv/must-gather-k96rn"] Dec 11 06:25:47 crc kubenswrapper[4628]: I1211 06:25:47.536895 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/166f5e85-d96f-490c-9606-f7df3d47809c-must-gather-output\") pod \"must-gather-k96rn\" (UID: \"166f5e85-d96f-490c-9606-f7df3d47809c\") " pod="openshift-must-gather-x7dhv/must-gather-k96rn" Dec 11 06:25:47 crc kubenswrapper[4628]: I1211 06:25:47.537251 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tbsh\" (UniqueName: \"kubernetes.io/projected/166f5e85-d96f-490c-9606-f7df3d47809c-kube-api-access-9tbsh\") pod \"must-gather-k96rn\" (UID: \"166f5e85-d96f-490c-9606-f7df3d47809c\") " pod="openshift-must-gather-x7dhv/must-gather-k96rn" Dec 11 06:25:47 crc kubenswrapper[4628]: I1211 06:25:47.638808 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tbsh\" (UniqueName: \"kubernetes.io/projected/166f5e85-d96f-490c-9606-f7df3d47809c-kube-api-access-9tbsh\") pod \"must-gather-k96rn\" (UID: \"166f5e85-d96f-490c-9606-f7df3d47809c\") " pod="openshift-must-gather-x7dhv/must-gather-k96rn" Dec 11 06:25:47 crc kubenswrapper[4628]: I1211 06:25:47.639516 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/166f5e85-d96f-490c-9606-f7df3d47809c-must-gather-output\") pod \"must-gather-k96rn\" (UID: \"166f5e85-d96f-490c-9606-f7df3d47809c\") " pod="openshift-must-gather-x7dhv/must-gather-k96rn" Dec 11 06:25:47 crc kubenswrapper[4628]: I1211 06:25:47.639914 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/166f5e85-d96f-490c-9606-f7df3d47809c-must-gather-output\") pod \"must-gather-k96rn\" (UID: \"166f5e85-d96f-490c-9606-f7df3d47809c\") " pod="openshift-must-gather-x7dhv/must-gather-k96rn" Dec 11 06:25:47 crc kubenswrapper[4628]: I1211 06:25:47.665589 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tbsh\" (UniqueName: \"kubernetes.io/projected/166f5e85-d96f-490c-9606-f7df3d47809c-kube-api-access-9tbsh\") pod \"must-gather-k96rn\" (UID: \"166f5e85-d96f-490c-9606-f7df3d47809c\") " pod="openshift-must-gather-x7dhv/must-gather-k96rn" Dec 11 06:25:47 crc kubenswrapper[4628]: I1211 06:25:47.771664 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x7dhv/must-gather-k96rn" Dec 11 06:25:48 crc kubenswrapper[4628]: I1211 06:25:48.328399 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x7dhv/must-gather-k96rn"] Dec 11 06:25:49 crc kubenswrapper[4628]: I1211 06:25:49.111251 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7dhv/must-gather-k96rn" event={"ID":"166f5e85-d96f-490c-9606-f7df3d47809c","Type":"ContainerStarted","Data":"b02d142b3f20b984e99ec2460d540772ed7704499153422aac4daf10aaa5d036"} Dec 11 06:25:56 crc kubenswrapper[4628]: I1211 06:25:56.176282 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7dhv/must-gather-k96rn" event={"ID":"166f5e85-d96f-490c-9606-f7df3d47809c","Type":"ContainerStarted","Data":"3d576c4f365aa6fff953fed907aa6081e0435bae1d727df7161b962880881e3e"} Dec 11 06:25:56 crc kubenswrapper[4628]: I1211 06:25:56.176910 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7dhv/must-gather-k96rn" event={"ID":"166f5e85-d96f-490c-9606-f7df3d47809c","Type":"ContainerStarted","Data":"7ffa5292350464bd1bc6babce2b2cd0a4db290d7dd03af9290f2401bb17484cb"} Dec 11 06:26:00 crc kubenswrapper[4628]: I1211 06:26:00.389254 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x7dhv/must-gather-k96rn" podStartSLOduration=6.116037754 podStartE2EDuration="13.389236683s" podCreationTimestamp="2025-12-11 06:25:47 +0000 UTC" firstStartedPulling="2025-12-11 06:25:48.317324137 +0000 UTC m=+4250.734670835" lastFinishedPulling="2025-12-11 06:25:55.590523066 +0000 UTC m=+4258.007869764" observedRunningTime="2025-12-11 06:25:56.198868354 +0000 UTC m=+4258.616215062" watchObservedRunningTime="2025-12-11 06:26:00.389236683 +0000 UTC m=+4262.806583381" Dec 11 06:26:00 crc kubenswrapper[4628]: I1211 06:26:00.393421 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x7dhv/crc-debug-gf6hs"] Dec 11 06:26:00 crc kubenswrapper[4628]: I1211 06:26:00.395301 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x7dhv/crc-debug-gf6hs" Dec 11 06:26:00 crc kubenswrapper[4628]: I1211 06:26:00.515236 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1313eb66-062c-4ff0-9cb3-2e89dd56ca0d-host\") pod \"crc-debug-gf6hs\" (UID: \"1313eb66-062c-4ff0-9cb3-2e89dd56ca0d\") " pod="openshift-must-gather-x7dhv/crc-debug-gf6hs" Dec 11 06:26:00 crc kubenswrapper[4628]: I1211 06:26:00.515472 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58s95\" (UniqueName: \"kubernetes.io/projected/1313eb66-062c-4ff0-9cb3-2e89dd56ca0d-kube-api-access-58s95\") pod \"crc-debug-gf6hs\" (UID: \"1313eb66-062c-4ff0-9cb3-2e89dd56ca0d\") " pod="openshift-must-gather-x7dhv/crc-debug-gf6hs" Dec 11 06:26:00 crc kubenswrapper[4628]: I1211 06:26:00.617487 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58s95\" (UniqueName: \"kubernetes.io/projected/1313eb66-062c-4ff0-9cb3-2e89dd56ca0d-kube-api-access-58s95\") pod \"crc-debug-gf6hs\" (UID: \"1313eb66-062c-4ff0-9cb3-2e89dd56ca0d\") " pod="openshift-must-gather-x7dhv/crc-debug-gf6hs" Dec 11 06:26:00 crc kubenswrapper[4628]: I1211 06:26:00.617576 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1313eb66-062c-4ff0-9cb3-2e89dd56ca0d-host\") pod \"crc-debug-gf6hs\" (UID: \"1313eb66-062c-4ff0-9cb3-2e89dd56ca0d\") " pod="openshift-must-gather-x7dhv/crc-debug-gf6hs" Dec 11 06:26:00 crc kubenswrapper[4628]: I1211 06:26:00.617758 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1313eb66-062c-4ff0-9cb3-2e89dd56ca0d-host\") pod \"crc-debug-gf6hs\" (UID: \"1313eb66-062c-4ff0-9cb3-2e89dd56ca0d\") " pod="openshift-must-gather-x7dhv/crc-debug-gf6hs" Dec 11 06:26:00 crc kubenswrapper[4628]: I1211 06:26:00.635719 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58s95\" (UniqueName: \"kubernetes.io/projected/1313eb66-062c-4ff0-9cb3-2e89dd56ca0d-kube-api-access-58s95\") pod \"crc-debug-gf6hs\" (UID: \"1313eb66-062c-4ff0-9cb3-2e89dd56ca0d\") " pod="openshift-must-gather-x7dhv/crc-debug-gf6hs" Dec 11 06:26:00 crc kubenswrapper[4628]: I1211 06:26:00.713366 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x7dhv/crc-debug-gf6hs" Dec 11 06:26:01 crc kubenswrapper[4628]: I1211 06:26:01.219717 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7dhv/crc-debug-gf6hs" event={"ID":"1313eb66-062c-4ff0-9cb3-2e89dd56ca0d","Type":"ContainerStarted","Data":"f8febca6df76c05cdf6ef4da600b483ba0c9a036f183ca9e661c78b8fb0f16bf"} Dec 11 06:26:01 crc kubenswrapper[4628]: I1211 06:26:01.427558 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:26:01 crc kubenswrapper[4628]: I1211 06:26:01.427693 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:26:13 crc kubenswrapper[4628]: I1211 06:26:13.357712 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7dhv/crc-debug-gf6hs" event={"ID":"1313eb66-062c-4ff0-9cb3-2e89dd56ca0d","Type":"ContainerStarted","Data":"9fa8d7c9af79bfc3fe3320865334f1b60091dbb7252158eeaa3d570e62b63b2b"} Dec 11 06:26:13 crc kubenswrapper[4628]: I1211 06:26:13.395607 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x7dhv/crc-debug-gf6hs" podStartSLOduration=1.8386214349999999 podStartE2EDuration="13.395580719s" podCreationTimestamp="2025-12-11 06:26:00 +0000 UTC" firstStartedPulling="2025-12-11 06:26:00.765349425 +0000 UTC m=+4263.182696123" lastFinishedPulling="2025-12-11 06:26:12.322308709 +0000 UTC m=+4274.739655407" observedRunningTime="2025-12-11 06:26:13.370977552 +0000 UTC m=+4275.788324280" watchObservedRunningTime="2025-12-11 06:26:13.395580719 +0000 UTC m=+4275.812927457" Dec 11 06:26:31 crc kubenswrapper[4628]: I1211 06:26:31.426667 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:26:31 crc kubenswrapper[4628]: I1211 06:26:31.427232 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:26:31 crc kubenswrapper[4628]: I1211 06:26:31.427275 4628 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 06:26:31 crc kubenswrapper[4628]: I1211 06:26:31.428073 4628 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa"} pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 06:26:31 crc kubenswrapper[4628]: I1211 06:26:31.428156 
4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" containerID="cri-o://cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" gracePeriod=600 Dec 11 06:26:31 crc kubenswrapper[4628]: E1211 06:26:31.566787 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:26:32 crc kubenswrapper[4628]: I1211 06:26:32.540110 4628 generic.go:334] "Generic (PLEG): container finished" podID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" exitCode=0 Dec 11 06:26:32 crc kubenswrapper[4628]: I1211 06:26:32.540276 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerDied","Data":"cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa"} Dec 11 06:26:32 crc kubenswrapper[4628]: I1211 06:26:32.540821 4628 scope.go:117] "RemoveContainer" containerID="3f7b78a4d45fc2941babafd07cee8c3222be00b496ea76158561b39fc28da3cd" Dec 11 06:26:32 crc kubenswrapper[4628]: I1211 06:26:32.541617 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:26:32 crc kubenswrapper[4628]: E1211 06:26:32.542035 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:26:44 crc kubenswrapper[4628]: I1211 06:26:44.889423 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:26:44 crc kubenswrapper[4628]: E1211 06:26:44.890286 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:26:57 crc kubenswrapper[4628]: I1211 06:26:57.902693 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:26:57 crc kubenswrapper[4628]: E1211 06:26:57.906262 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:27:05 crc kubenswrapper[4628]: I1211 06:27:05.825345 4628 generic.go:334] "Generic (PLEG): container finished" podID="1313eb66-062c-4ff0-9cb3-2e89dd56ca0d" containerID="9fa8d7c9af79bfc3fe3320865334f1b60091dbb7252158eeaa3d570e62b63b2b" exitCode=0 Dec 11 06:27:05 crc kubenswrapper[4628]: I1211 06:27:05.825834 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7dhv/crc-debug-gf6hs" event={"ID":"1313eb66-062c-4ff0-9cb3-2e89dd56ca0d","Type":"ContainerDied","Data":"9fa8d7c9af79bfc3fe3320865334f1b60091dbb7252158eeaa3d570e62b63b2b"} Dec 11 06:27:06 crc kubenswrapper[4628]: I1211 06:27:06.959814 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7dhv/crc-debug-gf6hs" Dec 11 06:27:07 crc kubenswrapper[4628]: I1211 06:27:07.001691 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-x7dhv/crc-debug-gf6hs"] Dec 11 06:27:07 crc kubenswrapper[4628]: I1211 06:27:07.009963 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x7dhv/crc-debug-gf6hs"] Dec 11 06:27:07 crc kubenswrapper[4628]: I1211 06:27:07.052735 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58s95\" (UniqueName: \"kubernetes.io/projected/1313eb66-062c-4ff0-9cb3-2e89dd56ca0d-kube-api-access-58s95\") pod \"1313eb66-062c-4ff0-9cb3-2e89dd56ca0d\" (UID: \"1313eb66-062c-4ff0-9cb3-2e89dd56ca0d\") " Dec 11 06:27:07 crc kubenswrapper[4628]: I1211 06:27:07.052812 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1313eb66-062c-4ff0-9cb3-2e89dd56ca0d-host\") pod \"1313eb66-062c-4ff0-9cb3-2e89dd56ca0d\" (UID: \"1313eb66-062c-4ff0-9cb3-2e89dd56ca0d\") " Dec 11 06:27:07 crc kubenswrapper[4628]: I1211 06:27:07.053208 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1313eb66-062c-4ff0-9cb3-2e89dd56ca0d-host" (OuterVolumeSpecName: "host") pod "1313eb66-062c-4ff0-9cb3-2e89dd56ca0d" (UID: "1313eb66-062c-4ff0-9cb3-2e89dd56ca0d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 06:27:07 crc kubenswrapper[4628]: I1211 06:27:07.053570 4628 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1313eb66-062c-4ff0-9cb3-2e89dd56ca0d-host\") on node \"crc\" DevicePath \"\"" Dec 11 06:27:07 crc kubenswrapper[4628]: I1211 06:27:07.063057 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1313eb66-062c-4ff0-9cb3-2e89dd56ca0d-kube-api-access-58s95" (OuterVolumeSpecName: "kube-api-access-58s95") pod "1313eb66-062c-4ff0-9cb3-2e89dd56ca0d" (UID: "1313eb66-062c-4ff0-9cb3-2e89dd56ca0d"). InnerVolumeSpecName "kube-api-access-58s95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:27:07 crc kubenswrapper[4628]: I1211 06:27:07.155203 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58s95\" (UniqueName: \"kubernetes.io/projected/1313eb66-062c-4ff0-9cb3-2e89dd56ca0d-kube-api-access-58s95\") on node \"crc\" DevicePath \"\"" Dec 11 06:27:07 crc kubenswrapper[4628]: I1211 06:27:07.845637 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8febca6df76c05cdf6ef4da600b483ba0c9a036f183ca9e661c78b8fb0f16bf" Dec 11 06:27:07 crc kubenswrapper[4628]: I1211 06:27:07.845705 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7dhv/crc-debug-gf6hs" Dec 11 06:27:07 crc kubenswrapper[4628]: I1211 06:27:07.916078 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1313eb66-062c-4ff0-9cb3-2e89dd56ca0d" path="/var/lib/kubelet/pods/1313eb66-062c-4ff0-9cb3-2e89dd56ca0d/volumes" Dec 11 06:27:08 crc kubenswrapper[4628]: I1211 06:27:08.210171 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x7dhv/crc-debug-j7pxg"] Dec 11 06:27:08 crc kubenswrapper[4628]: E1211 06:27:08.210653 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1313eb66-062c-4ff0-9cb3-2e89dd56ca0d" containerName="container-00" Dec 11 06:27:08 crc kubenswrapper[4628]: I1211 06:27:08.210669 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="1313eb66-062c-4ff0-9cb3-2e89dd56ca0d" containerName="container-00" Dec 11 06:27:08 crc kubenswrapper[4628]: I1211 06:27:08.211097 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="1313eb66-062c-4ff0-9cb3-2e89dd56ca0d" containerName="container-00" Dec 11 06:27:08 crc kubenswrapper[4628]: I1211 06:27:08.211912 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x7dhv/crc-debug-j7pxg" Dec 11 06:27:08 crc kubenswrapper[4628]: I1211 06:27:08.275326 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l72ph\" (UniqueName: \"kubernetes.io/projected/aeb30034-67ca-4e09-9fa6-6061b799461f-kube-api-access-l72ph\") pod \"crc-debug-j7pxg\" (UID: \"aeb30034-67ca-4e09-9fa6-6061b799461f\") " pod="openshift-must-gather-x7dhv/crc-debug-j7pxg" Dec 11 06:27:08 crc kubenswrapper[4628]: I1211 06:27:08.275595 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aeb30034-67ca-4e09-9fa6-6061b799461f-host\") pod \"crc-debug-j7pxg\" (UID: \"aeb30034-67ca-4e09-9fa6-6061b799461f\") " pod="openshift-must-gather-x7dhv/crc-debug-j7pxg" Dec 11 06:27:08 crc kubenswrapper[4628]: I1211 06:27:08.377379 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aeb30034-67ca-4e09-9fa6-6061b799461f-host\") pod \"crc-debug-j7pxg\" (UID: \"aeb30034-67ca-4e09-9fa6-6061b799461f\") " pod="openshift-must-gather-x7dhv/crc-debug-j7pxg" Dec 11 06:27:08 crc kubenswrapper[4628]: I1211 06:27:08.377515 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l72ph\" (UniqueName: \"kubernetes.io/projected/aeb30034-67ca-4e09-9fa6-6061b799461f-kube-api-access-l72ph\") pod \"crc-debug-j7pxg\" (UID: \"aeb30034-67ca-4e09-9fa6-6061b799461f\") " pod="openshift-must-gather-x7dhv/crc-debug-j7pxg" Dec 11 06:27:08 crc kubenswrapper[4628]: I1211 06:27:08.377560 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aeb30034-67ca-4e09-9fa6-6061b799461f-host\") pod \"crc-debug-j7pxg\" (UID: \"aeb30034-67ca-4e09-9fa6-6061b799461f\") " pod="openshift-must-gather-x7dhv/crc-debug-j7pxg" Dec 11 06:27:08 crc kubenswrapper[4628]: I1211 06:27:08.396294 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l72ph\" (UniqueName: \"kubernetes.io/projected/aeb30034-67ca-4e09-9fa6-6061b799461f-kube-api-access-l72ph\") pod \"crc-debug-j7pxg\" (UID: \"aeb30034-67ca-4e09-9fa6-6061b799461f\") " pod="openshift-must-gather-x7dhv/crc-debug-j7pxg" Dec 11 06:27:08 crc kubenswrapper[4628]: I1211 06:27:08.530258 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x7dhv/crc-debug-j7pxg" Dec 11 06:27:08 crc kubenswrapper[4628]: I1211 06:27:08.856633 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7dhv/crc-debug-j7pxg" event={"ID":"aeb30034-67ca-4e09-9fa6-6061b799461f","Type":"ContainerStarted","Data":"c6df323bd3a6bc742a982e191c75f7dcbbd5cc6c63ddc608adb16fc2747edcc7"} Dec 11 06:27:08 crc kubenswrapper[4628]: I1211 06:27:08.856689 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7dhv/crc-debug-j7pxg" event={"ID":"aeb30034-67ca-4e09-9fa6-6061b799461f","Type":"ContainerStarted","Data":"930cc00002036cd60f1829cfef02e6c2ac6a74a3d6414ddd45a327c0d811cbe7"} Dec 11 06:27:08 crc kubenswrapper[4628]: I1211 06:27:08.873608 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x7dhv/crc-debug-j7pxg" podStartSLOduration=0.873584047 podStartE2EDuration="873.584047ms" podCreationTimestamp="2025-12-11 06:27:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 06:27:08.868680132 +0000 UTC m=+4331.286026820" watchObservedRunningTime="2025-12-11 06:27:08.873584047 +0000 UTC m=+4331.290930745" Dec 11 06:27:09 crc kubenswrapper[4628]: I1211 06:27:09.865004 4628 generic.go:334] "Generic (PLEG): container finished" podID="aeb30034-67ca-4e09-9fa6-6061b799461f" containerID="c6df323bd3a6bc742a982e191c75f7dcbbd5cc6c63ddc608adb16fc2747edcc7" exitCode=0 Dec 11 06:27:09 crc kubenswrapper[4628]: I1211 06:27:09.865140 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7dhv/crc-debug-j7pxg" event={"ID":"aeb30034-67ca-4e09-9fa6-6061b799461f","Type":"ContainerDied","Data":"c6df323bd3a6bc742a982e191c75f7dcbbd5cc6c63ddc608adb16fc2747edcc7"} Dec 11 06:27:10 crc kubenswrapper[4628]: I1211 06:27:10.998806 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7dhv/crc-debug-j7pxg" Dec 11 06:27:11 crc kubenswrapper[4628]: I1211 06:27:11.020541 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aeb30034-67ca-4e09-9fa6-6061b799461f-host\") pod \"aeb30034-67ca-4e09-9fa6-6061b799461f\" (UID: \"aeb30034-67ca-4e09-9fa6-6061b799461f\") " Dec 11 06:27:11 crc kubenswrapper[4628]: I1211 06:27:11.020623 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aeb30034-67ca-4e09-9fa6-6061b799461f-host" (OuterVolumeSpecName: "host") pod "aeb30034-67ca-4e09-9fa6-6061b799461f" (UID: "aeb30034-67ca-4e09-9fa6-6061b799461f"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 06:27:11 crc kubenswrapper[4628]: I1211 06:27:11.020714 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l72ph\" (UniqueName: \"kubernetes.io/projected/aeb30034-67ca-4e09-9fa6-6061b799461f-kube-api-access-l72ph\") pod \"aeb30034-67ca-4e09-9fa6-6061b799461f\" (UID: \"aeb30034-67ca-4e09-9fa6-6061b799461f\") " Dec 11 06:27:11 crc kubenswrapper[4628]: I1211 06:27:11.021822 4628 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aeb30034-67ca-4e09-9fa6-6061b799461f-host\") on node \"crc\" DevicePath \"\"" Dec 11 06:27:11 crc kubenswrapper[4628]: I1211 06:27:11.025738 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeb30034-67ca-4e09-9fa6-6061b799461f-kube-api-access-l72ph" (OuterVolumeSpecName: "kube-api-access-l72ph") pod "aeb30034-67ca-4e09-9fa6-6061b799461f" (UID: "aeb30034-67ca-4e09-9fa6-6061b799461f"). InnerVolumeSpecName "kube-api-access-l72ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:27:11 crc kubenswrapper[4628]: I1211 06:27:11.057824 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-x7dhv/crc-debug-j7pxg"] Dec 11 06:27:11 crc kubenswrapper[4628]: I1211 06:27:11.069231 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x7dhv/crc-debug-j7pxg"] Dec 11 06:27:11 crc kubenswrapper[4628]: I1211 06:27:11.125330 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l72ph\" (UniqueName: \"kubernetes.io/projected/aeb30034-67ca-4e09-9fa6-6061b799461f-kube-api-access-l72ph\") on node \"crc\" DevicePath \"\"" Dec 11 06:27:11 crc kubenswrapper[4628]: I1211 06:27:11.889945 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:27:11 crc kubenswrapper[4628]: E1211 06:27:11.890521 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:27:11 crc kubenswrapper[4628]: I1211 06:27:11.904067 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x7dhv/crc-debug-j7pxg" Dec 11 06:27:11 crc kubenswrapper[4628]: I1211 06:27:11.920554 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeb30034-67ca-4e09-9fa6-6061b799461f" path="/var/lib/kubelet/pods/aeb30034-67ca-4e09-9fa6-6061b799461f/volumes" Dec 11 06:27:11 crc kubenswrapper[4628]: I1211 06:27:11.921290 4628 scope.go:117] "RemoveContainer" containerID="c6df323bd3a6bc742a982e191c75f7dcbbd5cc6c63ddc608adb16fc2747edcc7" Dec 11 06:27:12 crc kubenswrapper[4628]: I1211 06:27:12.237601 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x7dhv/crc-debug-gzl8h"] Dec 11 06:27:12 crc kubenswrapper[4628]: E1211 06:27:12.240588 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb30034-67ca-4e09-9fa6-6061b799461f" containerName="container-00" Dec 11 06:27:12 crc kubenswrapper[4628]: I1211 06:27:12.240744 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb30034-67ca-4e09-9fa6-6061b799461f" containerName="container-00" Dec 11 06:27:12 crc kubenswrapper[4628]: I1211 06:27:12.241176 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeb30034-67ca-4e09-9fa6-6061b799461f" containerName="container-00" Dec 11 06:27:12 crc kubenswrapper[4628]: I1211 06:27:12.242340 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7dhv/crc-debug-gzl8h" Dec 11 06:27:12 crc kubenswrapper[4628]: I1211 06:27:12.345325 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clszn\" (UniqueName: \"kubernetes.io/projected/355de5f2-1daa-4c8b-9073-07d5cabef696-kube-api-access-clszn\") pod \"crc-debug-gzl8h\" (UID: \"355de5f2-1daa-4c8b-9073-07d5cabef696\") " pod="openshift-must-gather-x7dhv/crc-debug-gzl8h" Dec 11 06:27:12 crc kubenswrapper[4628]: I1211 06:27:12.345391 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/355de5f2-1daa-4c8b-9073-07d5cabef696-host\") pod \"crc-debug-gzl8h\" (UID: \"355de5f2-1daa-4c8b-9073-07d5cabef696\") " pod="openshift-must-gather-x7dhv/crc-debug-gzl8h" Dec 11 06:27:12 crc kubenswrapper[4628]: I1211 06:27:12.447474 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clszn\" (UniqueName: \"kubernetes.io/projected/355de5f2-1daa-4c8b-9073-07d5cabef696-kube-api-access-clszn\") pod \"crc-debug-gzl8h\" (UID: \"355de5f2-1daa-4c8b-9073-07d5cabef696\") " pod="openshift-must-gather-x7dhv/crc-debug-gzl8h" Dec 11 06:27:12 crc kubenswrapper[4628]: I1211 06:27:12.447566 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/355de5f2-1daa-4c8b-9073-07d5cabef696-host\") pod \"crc-debug-gzl8h\" (UID: \"355de5f2-1daa-4c8b-9073-07d5cabef696\") " pod="openshift-must-gather-x7dhv/crc-debug-gzl8h" Dec 11 06:27:12 crc kubenswrapper[4628]: I1211 06:27:12.447638 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/355de5f2-1daa-4c8b-9073-07d5cabef696-host\") pod \"crc-debug-gzl8h\" (UID: \"355de5f2-1daa-4c8b-9073-07d5cabef696\") " pod="openshift-must-gather-x7dhv/crc-debug-gzl8h" Dec 11 06:27:12 crc kubenswrapper[4628]: I1211 06:27:12.465541 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clszn\" (UniqueName: 
\"kubernetes.io/projected/355de5f2-1daa-4c8b-9073-07d5cabef696-kube-api-access-clszn\") pod \"crc-debug-gzl8h\" (UID: \"355de5f2-1daa-4c8b-9073-07d5cabef696\") " pod="openshift-must-gather-x7dhv/crc-debug-gzl8h" Dec 11 06:27:12 crc kubenswrapper[4628]: I1211 06:27:12.561930 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7dhv/crc-debug-gzl8h" Dec 11 06:27:12 crc kubenswrapper[4628]: W1211 06:27:12.602371 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod355de5f2_1daa_4c8b_9073_07d5cabef696.slice/crio-1f95e0cb0c93014d96e89580a605322aea68f5e4dcb9fee789b5fe72631317bf WatchSource:0}: Error finding container 1f95e0cb0c93014d96e89580a605322aea68f5e4dcb9fee789b5fe72631317bf: Status 404 returned error can't find the container with id 1f95e0cb0c93014d96e89580a605322aea68f5e4dcb9fee789b5fe72631317bf Dec 11 06:27:12 crc kubenswrapper[4628]: I1211 06:27:12.914737 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7dhv/crc-debug-gzl8h" event={"ID":"355de5f2-1daa-4c8b-9073-07d5cabef696","Type":"ContainerStarted","Data":"1f95e0cb0c93014d96e89580a605322aea68f5e4dcb9fee789b5fe72631317bf"} Dec 11 06:27:13 crc kubenswrapper[4628]: I1211 06:27:13.932039 4628 generic.go:334] "Generic (PLEG): container finished" podID="355de5f2-1daa-4c8b-9073-07d5cabef696" containerID="c8caad6dc214fa5a1fc9e3539225fc3cf581712a7398b2aa1079ca04befff119" exitCode=0 Dec 11 06:27:13 crc kubenswrapper[4628]: I1211 06:27:13.932183 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7dhv/crc-debug-gzl8h" event={"ID":"355de5f2-1daa-4c8b-9073-07d5cabef696","Type":"ContainerDied","Data":"c8caad6dc214fa5a1fc9e3539225fc3cf581712a7398b2aa1079ca04befff119"} Dec 11 06:27:13 crc kubenswrapper[4628]: I1211 06:27:13.983369 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-x7dhv/crc-debug-gzl8h"] Dec 11 06:27:13 crc kubenswrapper[4628]: I1211 06:27:13.991536 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x7dhv/crc-debug-gzl8h"] Dec 11 06:27:15 crc kubenswrapper[4628]: I1211 06:27:15.291740 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7dhv/crc-debug-gzl8h" Dec 11 06:27:15 crc kubenswrapper[4628]: I1211 06:27:15.402726 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clszn\" (UniqueName: \"kubernetes.io/projected/355de5f2-1daa-4c8b-9073-07d5cabef696-kube-api-access-clszn\") pod \"355de5f2-1daa-4c8b-9073-07d5cabef696\" (UID: \"355de5f2-1daa-4c8b-9073-07d5cabef696\") " Dec 11 06:27:15 crc kubenswrapper[4628]: I1211 06:27:15.402810 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/355de5f2-1daa-4c8b-9073-07d5cabef696-host\") pod \"355de5f2-1daa-4c8b-9073-07d5cabef696\" (UID: \"355de5f2-1daa-4c8b-9073-07d5cabef696\") " Dec 11 06:27:15 crc kubenswrapper[4628]: I1211 06:27:15.402918 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/355de5f2-1daa-4c8b-9073-07d5cabef696-host" (OuterVolumeSpecName: "host") pod "355de5f2-1daa-4c8b-9073-07d5cabef696" (UID: "355de5f2-1daa-4c8b-9073-07d5cabef696"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 06:27:15 crc kubenswrapper[4628]: I1211 06:27:15.403158 4628 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/355de5f2-1daa-4c8b-9073-07d5cabef696-host\") on node \"crc\" DevicePath \"\"" Dec 11 06:27:15 crc kubenswrapper[4628]: I1211 06:27:15.413052 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/355de5f2-1daa-4c8b-9073-07d5cabef696-kube-api-access-clszn" (OuterVolumeSpecName: "kube-api-access-clszn") pod "355de5f2-1daa-4c8b-9073-07d5cabef696" (UID: "355de5f2-1daa-4c8b-9073-07d5cabef696"). InnerVolumeSpecName "kube-api-access-clszn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:27:15 crc kubenswrapper[4628]: I1211 06:27:15.504505 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clszn\" (UniqueName: \"kubernetes.io/projected/355de5f2-1daa-4c8b-9073-07d5cabef696-kube-api-access-clszn\") on node \"crc\" DevicePath \"\"" Dec 11 06:27:15 crc kubenswrapper[4628]: I1211 06:27:15.901033 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="355de5f2-1daa-4c8b-9073-07d5cabef696" path="/var/lib/kubelet/pods/355de5f2-1daa-4c8b-9073-07d5cabef696/volumes" Dec 11 06:27:15 crc kubenswrapper[4628]: I1211 06:27:15.948493 4628 scope.go:117] "RemoveContainer" containerID="c8caad6dc214fa5a1fc9e3539225fc3cf581712a7398b2aa1079ca04befff119" Dec 11 06:27:15 crc kubenswrapper[4628]: I1211 06:27:15.948544 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7dhv/crc-debug-gzl8h" Dec 11 06:27:26 crc kubenswrapper[4628]: I1211 06:27:26.890397 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:27:26 crc kubenswrapper[4628]: E1211 06:27:26.891113 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:27:32 crc kubenswrapper[4628]: I1211 06:27:32.394582 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-f64c5f7b6-ctrn9_b3b9644d-2578-4628-ac4c-28d16e0657e0/barbican-api/0.log" Dec 11 06:27:32 crc kubenswrapper[4628]: I1211 06:27:32.531324 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-f64c5f7b6-ctrn9_b3b9644d-2578-4628-ac4c-28d16e0657e0/barbican-api-log/0.log" Dec 11 06:27:32 crc kubenswrapper[4628]: I1211 06:27:32.625743 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-b5c776c64-wmwpw_391ad8e5-9c8c-463c-8d25-4d74e3f8cf94/barbican-keystone-listener/0.log" Dec 11 06:27:32 crc kubenswrapper[4628]: I1211 06:27:32.671242 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-b5c776c64-wmwpw_391ad8e5-9c8c-463c-8d25-4d74e3f8cf94/barbican-keystone-listener-log/0.log" Dec 11 06:27:32 crc kubenswrapper[4628]: I1211 06:27:32.835626 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-759d9b665f-6pnnw_b223e08d-3dfd-4c2d-b720-fe142822a27c/barbican-worker/0.log" Dec 11 06:27:32 crc 
kubenswrapper[4628]: I1211 06:27:32.953160 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-759d9b665f-6pnnw_b223e08d-3dfd-4c2d-b720-fe142822a27c/barbican-worker-log/0.log" Dec 11 06:27:33 crc kubenswrapper[4628]: I1211 06:27:33.076359 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98_376d3aeb-b569-4e4e-847a-762ed8f12b35/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:27:33 crc kubenswrapper[4628]: I1211 06:27:33.194689 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e0091ba0-9c70-41dd-8f21-68968a10a308/ceilometer-central-agent/0.log" Dec 11 06:27:33 crc kubenswrapper[4628]: I1211 06:27:33.233984 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e0091ba0-9c70-41dd-8f21-68968a10a308/ceilometer-notification-agent/0.log" Dec 11 06:27:33 crc kubenswrapper[4628]: I1211 06:27:33.387080 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e0091ba0-9c70-41dd-8f21-68968a10a308/sg-core/0.log" Dec 11 06:27:33 crc kubenswrapper[4628]: I1211 06:27:33.430126 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e0091ba0-9c70-41dd-8f21-68968a10a308/proxy-httpd/0.log" Dec 11 06:27:33 crc kubenswrapper[4628]: I1211 06:27:33.505596 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_12485bc3-6a23-4772-8c00-2148c65fe10d/cinder-api/0.log" Dec 11 06:27:33 crc kubenswrapper[4628]: I1211 06:27:33.651117 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_12485bc3-6a23-4772-8c00-2148c65fe10d/cinder-api-log/0.log" Dec 11 06:27:33 crc kubenswrapper[4628]: I1211 06:27:33.784999 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_df6f8836-27ef-4cbd-aed1-0949861716db/cinder-scheduler/0.log" Dec 11 06:27:33 crc kubenswrapper[4628]: I1211 06:27:33.794101 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_df6f8836-27ef-4cbd-aed1-0949861716db/probe/0.log" Dec 11 06:27:34 crc kubenswrapper[4628]: I1211 06:27:34.720960 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg_74ebd783-bcc7-4521-a9f2-450201f04c18/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:27:34 crc kubenswrapper[4628]: I1211 06:27:34.748274 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d_6838fcd4-0c2b-4c92-880c-eb9029af8a00/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:27:34 crc kubenswrapper[4628]: I1211 06:27:34.944386 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-gz9jh_1842054b-c613-4c76-9cb8-3738bc44a946/init/0.log" Dec 11 06:27:35 crc kubenswrapper[4628]: I1211 06:27:35.168572 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-gz9jh_1842054b-c613-4c76-9cb8-3738bc44a946/init/0.log" Dec 11 06:27:35 crc kubenswrapper[4628]: I1211 06:27:35.290063 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd_4416beb7-730c-4898-b603-a123279eb238/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:27:35 crc kubenswrapper[4628]: I1211 06:27:35.334926 4628 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-gz9jh_1842054b-c613-4c76-9cb8-3738bc44a946/dnsmasq-dns/0.log" Dec 11 06:27:36 crc kubenswrapper[4628]: I1211 06:27:36.120901 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_634cddf9-405e-42ee-a106-3c99b8d921d1/glance-log/0.log" Dec 11 06:27:36 crc kubenswrapper[4628]: I1211 06:27:36.238357 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_634cddf9-405e-42ee-a106-3c99b8d921d1/glance-httpd/0.log" Dec 11 06:27:36 crc kubenswrapper[4628]: I1211 06:27:36.273427 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_83368b19-5867-444e-a7ea-55683f0e6b26/glance-httpd/0.log" Dec 11 06:27:36 crc kubenswrapper[4628]: I1211 06:27:36.456014 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_83368b19-5867-444e-a7ea-55683f0e6b26/glance-log/0.log" Dec 11 06:27:36 crc kubenswrapper[4628]: I1211 06:27:36.628374 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7989644c86-scmh4_51e02694-e634-4a3b-8406-3b3b72007c2b/horizon/0.log" Dec 11 06:27:36 crc kubenswrapper[4628]: I1211 06:27:36.889397 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7989644c86-scmh4_51e02694-e634-4a3b-8406-3b3b72007c2b/horizon-log/0.log" Dec 11 06:27:36 crc kubenswrapper[4628]: I1211 06:27:36.905928 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v_cbe8ae7a-0268-477b-a232-fb89a86e6c30/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:27:36 crc kubenswrapper[4628]: I1211 06:27:36.982761 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-jhbk5_7802f047-ef49-4339-8783-fa927f841103/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:27:37 crc kubenswrapper[4628]: I1211 06:27:37.259223 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29423881-sxmlb_c82db411-744d-4cc8-8ae5-3031c70241d4/keystone-cron/0.log" Dec 11 06:27:37 crc kubenswrapper[4628]: I1211 06:27:37.390606 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6dd99782-66a8-47e8-a4cf-d5f2805655dc/kube-state-metrics/0.log" Dec 11 06:27:37 crc kubenswrapper[4628]: I1211 06:27:37.659119 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-55f69568c9-2p2zq_3b27f97f-7392-47aa-8551-badeb28bce06/keystone-api/0.log" Dec 11 06:27:37 crc kubenswrapper[4628]: I1211 06:27:37.738200 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw_10745043-8954-4864-9b9b-d3b2e8614e36/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:27:38 crc kubenswrapper[4628]: I1211 06:27:38.567405 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7_c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:27:38 crc kubenswrapper[4628]: I1211 06:27:38.815325 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67df497849-l9zzv_9d505505-13f5-4899-b1a5-7f739066e73c/neutron-httpd/0.log" Dec 11 06:27:38 crc kubenswrapper[4628]: I1211 
06:27:38.999576 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67df497849-l9zzv_9d505505-13f5-4899-b1a5-7f739066e73c/neutron-api/0.log" Dec 11 06:27:39 crc kubenswrapper[4628]: I1211 06:27:39.579185 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_ccc5f4b2-0364-42a0-abba-16c0e471f5c6/nova-cell0-conductor-conductor/0.log" Dec 11 06:27:39 crc kubenswrapper[4628]: I1211 06:27:39.779945 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_48ebe2ed-1819-4bd2-9f26-e8f392645684/nova-cell1-conductor-conductor/0.log" Dec 11 06:27:40 crc kubenswrapper[4628]: I1211 06:27:40.169451 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d1b29ff2-7a02-42ed-9dde-d998ad2e693f/nova-cell1-novncproxy-novncproxy/0.log" Dec 11 06:27:40 crc kubenswrapper[4628]: I1211 06:27:40.225316 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_dfa4c6b3-3b46-4383-8813-38a038d8e0da/nova-api-log/0.log" Dec 11 06:27:40 crc kubenswrapper[4628]: I1211 06:27:40.534667 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-fz78t_1b0b9e64-e4c3-4250-ae8d-319461717fcd/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:27:40 crc kubenswrapper[4628]: I1211 06:27:40.593656 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_dfa4c6b3-3b46-4383-8813-38a038d8e0da/nova-api-api/0.log" Dec 11 06:27:40 crc kubenswrapper[4628]: I1211 06:27:40.624255 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a14c2329-c910-45b3-a28c-258f07a31c5f/nova-metadata-log/0.log" Dec 11 06:27:41 crc kubenswrapper[4628]: I1211 06:27:41.075763 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f5879bd1-c58f-4c7a-8158-8be2bd632bf8/mysql-bootstrap/0.log" Dec 11 06:27:41 crc kubenswrapper[4628]: I1211 06:27:41.328632 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f5879bd1-c58f-4c7a-8158-8be2bd632bf8/galera/0.log" Dec 11 06:27:41 crc kubenswrapper[4628]: I1211 06:27:41.348475 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f5879bd1-c58f-4c7a-8158-8be2bd632bf8/mysql-bootstrap/0.log" Dec 11 06:27:41 crc kubenswrapper[4628]: I1211 06:27:41.353999 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_997489dd-97c9-4359-9920-8bdb512f708b/nova-scheduler-scheduler/0.log" Dec 11 06:27:41 crc kubenswrapper[4628]: I1211 06:27:41.577094 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e4498a18-7449-45b3-9061-d3ffbfa4be5b/mysql-bootstrap/0.log" Dec 11 06:27:41 crc kubenswrapper[4628]: I1211 06:27:41.892813 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:27:41 crc kubenswrapper[4628]: E1211 06:27:41.893025 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 
06:27:41 crc kubenswrapper[4628]: I1211 06:27:41.935642 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e4498a18-7449-45b3-9061-d3ffbfa4be5b/mysql-bootstrap/0.log" Dec 11 06:27:41 crc kubenswrapper[4628]: I1211 06:27:41.943996 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e4498a18-7449-45b3-9061-d3ffbfa4be5b/galera/0.log" Dec 11 06:27:42 crc kubenswrapper[4628]: I1211 06:27:42.326789 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a14c2329-c910-45b3-a28c-258f07a31c5f/nova-metadata-metadata/0.log" Dec 11 06:27:42 crc kubenswrapper[4628]: I1211 06:27:42.520905 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_75983d64-ba11-4ef7-a433-34863bd80b58/openstackclient/0.log" Dec 11 06:27:42 crc kubenswrapper[4628]: I1211 06:27:42.536529 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-szj8g_c4361dca-0563-4576-a32a-2f03e4f399a0/openstack-network-exporter/0.log" Dec 11 06:27:42 crc kubenswrapper[4628]: I1211 06:27:42.749662 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gncbg_7c72a5ae-bbee-41cd-bb23-b9feb77f594d/ovsdb-server-init/0.log" Dec 11 06:27:43 crc kubenswrapper[4628]: I1211 06:27:43.044655 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gncbg_7c72a5ae-bbee-41cd-bb23-b9feb77f594d/ovsdb-server/0.log" Dec 11 06:27:43 crc kubenswrapper[4628]: I1211 06:27:43.084617 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gncbg_7c72a5ae-bbee-41cd-bb23-b9feb77f594d/ovs-vswitchd/0.log" Dec 11 06:27:43 crc kubenswrapper[4628]: I1211 06:27:43.088097 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gncbg_7c72a5ae-bbee-41cd-bb23-b9feb77f594d/ovsdb-server-init/0.log" Dec 11 06:27:43 crc kubenswrapper[4628]: I1211 06:27:43.268087 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-qz7fr_d0885fe4-936a-4a13-b4e5-4aeee593c242/ovn-controller/0.log" Dec 11 06:27:43 crc kubenswrapper[4628]: I1211 06:27:43.917814 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-drhds_5ab6a157-55db-4fda-8066-c9fee33d98b4/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:27:43 crc kubenswrapper[4628]: I1211 06:27:43.933133 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_0ba5be80-485c-4b8b-8e1d-3326db7cc5a0/openstack-network-exporter/0.log" Dec 11 06:27:44 crc kubenswrapper[4628]: I1211 06:27:44.259560 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b16de833-9dc8-4e72-92b8-9374c7ab50bf/openstack-network-exporter/0.log" Dec 11 06:27:44 crc kubenswrapper[4628]: I1211 06:27:44.295694 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_0ba5be80-485c-4b8b-8e1d-3326db7cc5a0/ovn-northd/0.log" Dec 11 06:27:44 crc kubenswrapper[4628]: I1211 06:27:44.296244 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b16de833-9dc8-4e72-92b8-9374c7ab50bf/ovsdbserver-nb/0.log" Dec 11 06:27:44 crc kubenswrapper[4628]: I1211 06:27:44.620348 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b45a8a8a-00cb-482a-bfc5-149e693949c1/openstack-network-exporter/0.log" Dec 11 
06:27:44 crc kubenswrapper[4628]: I1211 06:27:44.636593 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b45a8a8a-00cb-482a-bfc5-149e693949c1/ovsdbserver-sb/0.log" Dec 11 06:27:44 crc kubenswrapper[4628]: I1211 06:27:44.929458 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-589776fd4-wpbmd_bf66d5d0-5466-4c43-ab27-23c603bd90f7/placement-api/0.log" Dec 11 06:27:45 crc kubenswrapper[4628]: I1211 06:27:45.146500 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-589776fd4-wpbmd_bf66d5d0-5466-4c43-ab27-23c603bd90f7/placement-log/0.log" Dec 11 06:27:45 crc kubenswrapper[4628]: I1211 06:27:45.450563 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_38ba9ced-55a9-40ad-8581-45f8d87da5ef/setup-container/0.log" Dec 11 06:27:45 crc kubenswrapper[4628]: I1211 06:27:45.664971 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_38ba9ced-55a9-40ad-8581-45f8d87da5ef/rabbitmq/0.log" Dec 11 06:27:45 crc kubenswrapper[4628]: I1211 06:27:45.666381 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_38ba9ced-55a9-40ad-8581-45f8d87da5ef/setup-container/0.log" Dec 11 06:27:45 crc kubenswrapper[4628]: I1211 06:27:45.772781 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3c89e316-b7b8-4740-aa49-0c21052a51de/setup-container/0.log" Dec 11 06:27:46 crc kubenswrapper[4628]: I1211 06:27:46.110676 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn_0e07dc05-985f-429b-8c55-221b86fb63be/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:27:46 crc kubenswrapper[4628]: I1211 06:27:46.114620 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3c89e316-b7b8-4740-aa49-0c21052a51de/setup-container/0.log" Dec 11 06:27:46 crc kubenswrapper[4628]: I1211 06:27:46.160681 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3c89e316-b7b8-4740-aa49-0c21052a51de/rabbitmq/0.log" Dec 11 06:27:46 crc kubenswrapper[4628]: I1211 06:27:46.692072 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-7cjnm_cf8e2426-3f6e-4291-b9ea-77b91670d471/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:27:46 crc kubenswrapper[4628]: I1211 06:27:46.719124 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55_67fc31e9-87aa-48c9-9888-52a10d0858dd/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:27:46 crc kubenswrapper[4628]: I1211 06:27:46.966395 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-qjbxj_381944d6-a058-41f8-a452-82d1933510e3/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:27:47 crc kubenswrapper[4628]: I1211 06:27:47.107603 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-mfrfd_179d12ab-f93f-4ce5-a674-deed794d48f0/ssh-known-hosts-edpm-deployment/0.log" Dec 11 06:27:47 crc kubenswrapper[4628]: I1211 06:27:47.276313 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7588f48d9f-5vkfm_b87045d6-b3bc-468e-8121-1023f3f30de0/proxy-server/0.log" Dec 11 06:27:47 crc 
kubenswrapper[4628]: I1211 06:27:47.393711 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7588f48d9f-5vkfm_b87045d6-b3bc-468e-8121-1023f3f30de0/proxy-httpd/0.log" Dec 11 06:27:47 crc kubenswrapper[4628]: I1211 06:27:47.532484 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-t4rhc_5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1/swift-ring-rebalance/0.log" Dec 11 06:27:47 crc kubenswrapper[4628]: I1211 06:27:47.717974 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/account-reaper/0.log" Dec 11 06:27:47 crc kubenswrapper[4628]: I1211 06:27:47.742354 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/account-auditor/0.log" Dec 11 06:27:47 crc kubenswrapper[4628]: I1211 06:27:47.789223 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/account-replicator/0.log" Dec 11 06:27:47 crc kubenswrapper[4628]: I1211 06:27:47.863440 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/account-server/0.log" Dec 11 06:27:48 crc kubenswrapper[4628]: I1211 06:27:48.014182 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/container-auditor/0.log" Dec 11 06:27:48 crc kubenswrapper[4628]: I1211 06:27:48.053562 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/container-server/0.log" Dec 11 06:27:48 crc kubenswrapper[4628]: I1211 06:27:48.066565 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/container-replicator/0.log" Dec 11 06:27:48 crc kubenswrapper[4628]: I1211 06:27:48.165073 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/container-updater/0.log" Dec 11 06:27:48 crc kubenswrapper[4628]: I1211 06:27:48.303927 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/object-auditor/0.log" Dec 11 06:27:48 crc kubenswrapper[4628]: I1211 06:27:48.370892 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/object-expirer/0.log" Dec 11 06:27:48 crc kubenswrapper[4628]: I1211 06:27:48.443694 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/object-server/0.log" Dec 11 06:27:48 crc kubenswrapper[4628]: I1211 06:27:48.476804 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/object-replicator/0.log" Dec 11 06:27:48 crc kubenswrapper[4628]: I1211 06:27:48.584883 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/object-updater/0.log" Dec 11 06:27:48 crc kubenswrapper[4628]: I1211 06:27:48.642174 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/rsync/0.log" Dec 11 06:27:48 crc kubenswrapper[4628]: I1211 06:27:48.713445 4628 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/swift-recon-cron/0.log" Dec 11 06:27:48 crc kubenswrapper[4628]: I1211 06:27:48.918824 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-f28db_70e52eb8-3a47-4192-9d87-3178a99becfe/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:27:48 crc kubenswrapper[4628]: I1211 06:27:48.997220 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_bdf75bdb-5535-4134-b9aa-f094e9e220fc/tempest-tests-tempest-tests-runner/0.log" Dec 11 06:27:49 crc kubenswrapper[4628]: I1211 06:27:49.148224 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_4072e36d-49c0-40a9-93d2-5700ef264f8b/test-operator-logs-container/0.log" Dec 11 06:27:49 crc kubenswrapper[4628]: I1211 06:27:49.310966 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl_2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:27:55 crc kubenswrapper[4628]: I1211 06:27:55.890741 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:27:55 crc kubenswrapper[4628]: E1211 06:27:55.891471 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:28:03 crc kubenswrapper[4628]: I1211 06:28:03.820887 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_a14ea6c9-f372-463b-8485-a3411412cbe9/memcached/0.log" Dec 11 06:28:08 crc kubenswrapper[4628]: I1211 06:28:08.889620 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:28:08 crc kubenswrapper[4628]: E1211 06:28:08.890470 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:28:20 crc kubenswrapper[4628]: I1211 06:28:20.890305 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:28:20 crc kubenswrapper[4628]: E1211 06:28:20.891227 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:28:23 crc kubenswrapper[4628]: I1211 06:28:23.455191 4628 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7_be5c0815-ff74-4d42-b5fa-5c3291e5f71d/util/0.log" Dec 11 06:28:23 crc kubenswrapper[4628]: I1211 06:28:23.678112 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7_be5c0815-ff74-4d42-b5fa-5c3291e5f71d/pull/0.log" Dec 11 06:28:23 crc kubenswrapper[4628]: I1211 06:28:23.683821 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7_be5c0815-ff74-4d42-b5fa-5c3291e5f71d/util/0.log" Dec 11 06:28:23 crc kubenswrapper[4628]: I1211 06:28:23.684761 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7_be5c0815-ff74-4d42-b5fa-5c3291e5f71d/pull/0.log" Dec 11 06:28:23 crc kubenswrapper[4628]: I1211 06:28:23.910060 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7_be5c0815-ff74-4d42-b5fa-5c3291e5f71d/util/0.log" Dec 11 06:28:23 crc kubenswrapper[4628]: I1211 06:28:23.946079 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7_be5c0815-ff74-4d42-b5fa-5c3291e5f71d/pull/0.log" Dec 11 06:28:23 crc kubenswrapper[4628]: I1211 06:28:23.975774 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7_be5c0815-ff74-4d42-b5fa-5c3291e5f71d/extract/0.log" Dec 11 06:28:24 crc kubenswrapper[4628]: I1211 06:28:24.159378 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-9d9wj_f7d58419-0988-4a35-800f-2298db8e6597/kube-rbac-proxy/0.log" Dec 11 06:28:24 crc kubenswrapper[4628]: I1211 06:28:24.216493 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-9d9wj_f7d58419-0988-4a35-800f-2298db8e6597/manager/0.log" Dec 11 06:28:24 crc kubenswrapper[4628]: I1211 06:28:24.286277 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-h5xhk_43de67af-1cf5-4412-833e-e95e2ffcc47b/kube-rbac-proxy/0.log" Dec 11 06:28:24 crc kubenswrapper[4628]: I1211 06:28:24.455667 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-h5xhk_43de67af-1cf5-4412-833e-e95e2ffcc47b/manager/0.log" Dec 11 06:28:24 crc kubenswrapper[4628]: I1211 06:28:24.596009 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-whlx7_2f46589d-ec5b-48e9-8f64-741a6a5b3e84/kube-rbac-proxy/0.log" Dec 11 06:28:24 crc kubenswrapper[4628]: I1211 06:28:24.655911 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-whlx7_2f46589d-ec5b-48e9-8f64-741a6a5b3e84/manager/0.log" Dec 11 06:28:24 crc kubenswrapper[4628]: I1211 06:28:24.793251 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-v2lxt_dbdd3dcf-94cf-4b1e-9918-5d8efbe60360/kube-rbac-proxy/0.log" Dec 11 06:28:24 crc kubenswrapper[4628]: 
I1211 06:28:24.903002 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-v2lxt_dbdd3dcf-94cf-4b1e-9918-5d8efbe60360/manager/0.log" Dec 11 06:28:24 crc kubenswrapper[4628]: I1211 06:28:24.988439 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-9bcfl_f041b1fa-37ae-46fc-b6b0-301da06c1ff7/kube-rbac-proxy/0.log" Dec 11 06:28:25 crc kubenswrapper[4628]: I1211 06:28:25.062382 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-9bcfl_f041b1fa-37ae-46fc-b6b0-301da06c1ff7/manager/0.log" Dec 11 06:28:25 crc kubenswrapper[4628]: I1211 06:28:25.179673 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-vcz8d_232e8d69-426a-4259-93ab-1ebb4fa89a17/kube-rbac-proxy/0.log" Dec 11 06:28:25 crc kubenswrapper[4628]: I1211 06:28:25.263596 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-vcz8d_232e8d69-426a-4259-93ab-1ebb4fa89a17/manager/0.log" Dec 11 06:28:25 crc kubenswrapper[4628]: I1211 06:28:25.439267 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-6ff94_ae8e31fb-df50-4c43-af56-9c01af34f181/kube-rbac-proxy/0.log" Dec 11 06:28:25 crc kubenswrapper[4628]: I1211 06:28:25.673676 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-x4p8r_c8834adf-70c2-46a6-a5d7-bdb2ddfc91d2/kube-rbac-proxy/0.log" Dec 11 06:28:25 crc kubenswrapper[4628]: I1211 06:28:25.754220 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-6ff94_ae8e31fb-df50-4c43-af56-9c01af34f181/manager/0.log" Dec 11 06:28:25 crc kubenswrapper[4628]: I1211 06:28:25.777390 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-x4p8r_c8834adf-70c2-46a6-a5d7-bdb2ddfc91d2/manager/0.log" Dec 11 06:28:25 crc kubenswrapper[4628]: I1211 06:28:25.921999 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-z6dn7_d0e69cfa-5f08-4640-b9f8-b7c27ef8660f/kube-rbac-proxy/0.log" Dec 11 06:28:26 crc kubenswrapper[4628]: I1211 06:28:26.096172 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-z6dn7_d0e69cfa-5f08-4640-b9f8-b7c27ef8660f/manager/0.log" Dec 11 06:28:26 crc kubenswrapper[4628]: I1211 06:28:26.138221 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-bqcdn_c8063e93-9008-453c-805c-487456b5e0ac/kube-rbac-proxy/0.log" Dec 11 06:28:26 crc kubenswrapper[4628]: I1211 06:28:26.183101 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-bqcdn_c8063e93-9008-453c-805c-487456b5e0ac/manager/0.log" Dec 11 06:28:26 crc kubenswrapper[4628]: I1211 06:28:26.374372 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-jqz2j_88d0bbcc-5138-434d-811b-d8db056922cb/kube-rbac-proxy/0.log" Dec 11 06:28:26 crc 
kubenswrapper[4628]: I1211 06:28:26.496213 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-jqz2j_88d0bbcc-5138-434d-811b-d8db056922cb/manager/0.log" Dec 11 06:28:26 crc kubenswrapper[4628]: I1211 06:28:26.560114 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-cjb98_c2e9f8e4-3eda-4227-ad4a-8f8641f88612/kube-rbac-proxy/0.log" Dec 11 06:28:26 crc kubenswrapper[4628]: I1211 06:28:26.628272 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-cjb98_c2e9f8e4-3eda-4227-ad4a-8f8641f88612/manager/0.log" Dec 11 06:28:26 crc kubenswrapper[4628]: I1211 06:28:26.786028 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-w5xrs_d92dcd20-90f9-4499-bae5-f117cf41b4d5/kube-rbac-proxy/0.log" Dec 11 06:28:26 crc kubenswrapper[4628]: I1211 06:28:26.917412 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-w5xrs_d92dcd20-90f9-4499-bae5-f117cf41b4d5/manager/0.log" Dec 11 06:28:27 crc kubenswrapper[4628]: I1211 06:28:27.024525 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-vftnq_a7d3410e-df7b-4de8-aa0f-4c6de9e251e7/kube-rbac-proxy/0.log" Dec 11 06:28:27 crc kubenswrapper[4628]: I1211 06:28:27.072476 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-vftnq_a7d3410e-df7b-4de8-aa0f-4c6de9e251e7/manager/0.log" Dec 11 06:28:27 crc kubenswrapper[4628]: I1211 06:28:27.273714 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879f9w8m4_3112c087-1436-4f0a-8b0c-6000b07a0f77/kube-rbac-proxy/0.log" Dec 11 06:28:27 crc kubenswrapper[4628]: I1211 06:28:27.273983 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879f9w8m4_3112c087-1436-4f0a-8b0c-6000b07a0f77/manager/0.log" Dec 11 06:28:27 crc kubenswrapper[4628]: I1211 06:28:27.777203 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7fb58bb479-g2k9b_91ff2419-7fdf-4656-8d3a-69295ad50387/operator/0.log" Dec 11 06:28:27 crc kubenswrapper[4628]: I1211 06:28:27.803347 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zp8gf_589fc89a-de3e-4916-81a4-5972e3bd2410/registry-server/0.log" Dec 11 06:28:27 crc kubenswrapper[4628]: I1211 06:28:27.951132 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-nc2xx_bca7bee3-0202-48ba-b0e9-3353f6ab0938/kube-rbac-proxy/0.log" Dec 11 06:28:28 crc kubenswrapper[4628]: I1211 06:28:28.162810 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-nc2xx_bca7bee3-0202-48ba-b0e9-3353f6ab0938/manager/0.log" Dec 11 06:28:28 crc kubenswrapper[4628]: I1211 06:28:28.266242 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-qxsdk_ee29a0b0-46f9-45f6-b356-dde79504d5cc/kube-rbac-proxy/0.log" 
Dec 11 06:28:28 crc kubenswrapper[4628]: I1211 06:28:28.359672 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-qxsdk_ee29a0b0-46f9-45f6-b356-dde79504d5cc/manager/0.log" Dec 11 06:28:28 crc kubenswrapper[4628]: I1211 06:28:28.527837 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-l85kc_938faeea-3048-4d4a-8f3d-e22b31c73f47/operator/0.log" Dec 11 06:28:28 crc kubenswrapper[4628]: I1211 06:28:28.613814 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7546d6d447-f9qwn_c0ac60c7-7b87-490a-9107-ad5de9864845/manager/0.log" Dec 11 06:28:28 crc kubenswrapper[4628]: I1211 06:28:28.663907 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-zvjrq_2b786de1-276f-470c-b60a-e93596dd9e47/kube-rbac-proxy/0.log" Dec 11 06:28:28 crc kubenswrapper[4628]: I1211 06:28:28.840956 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-zvjrq_2b786de1-276f-470c-b60a-e93596dd9e47/manager/0.log" Dec 11 06:28:28 crc kubenswrapper[4628]: I1211 06:28:28.861329 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-2q4b9_2b9ef50b-db17-4df4-a936-5a02a25f61d7/kube-rbac-proxy/0.log" Dec 11 06:28:28 crc kubenswrapper[4628]: I1211 06:28:28.952174 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-2q4b9_2b9ef50b-db17-4df4-a936-5a02a25f61d7/manager/0.log" Dec 11 06:28:29 crc kubenswrapper[4628]: I1211 06:28:29.137970 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-tnvqg_9dc3edb0-5d7f-4b4f-bea2-5f9c25b222fe/kube-rbac-proxy/0.log" Dec 11 06:28:29 crc kubenswrapper[4628]: I1211 06:28:29.150747 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-tnvqg_9dc3edb0-5d7f-4b4f-bea2-5f9c25b222fe/manager/0.log" Dec 11 06:28:29 crc kubenswrapper[4628]: I1211 06:28:29.194929 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75944c9b7-l2wf4_53a3113c-a3d2-42c8-8ab8-b26b448a728a/kube-rbac-proxy/0.log" Dec 11 06:28:29 crc kubenswrapper[4628]: I1211 06:28:29.222089 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75944c9b7-l2wf4_53a3113c-a3d2-42c8-8ab8-b26b448a728a/manager/0.log" Dec 11 06:28:35 crc kubenswrapper[4628]: I1211 06:28:35.890111 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:28:35 crc kubenswrapper[4628]: E1211 06:28:35.890768 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:28:49 crc kubenswrapper[4628]: I1211 06:28:49.890413 4628 scope.go:117] 
"RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:28:49 crc kubenswrapper[4628]: E1211 06:28:49.891263 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:28:50 crc kubenswrapper[4628]: I1211 06:28:50.591745 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-78pgf_e22056a0-8001-488d-9dd7-9368d4a459e8/control-plane-machine-set-operator/0.log" Dec 11 06:28:51 crc kubenswrapper[4628]: I1211 06:28:51.151802 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lt4q5_575ba7ec-e024-40c7-be59-44a90232b4f2/machine-api-operator/0.log" Dec 11 06:28:51 crc kubenswrapper[4628]: I1211 06:28:51.152236 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lt4q5_575ba7ec-e024-40c7-be59-44a90232b4f2/kube-rbac-proxy/0.log" Dec 11 06:29:02 crc kubenswrapper[4628]: I1211 06:29:02.890010 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:29:02 crc kubenswrapper[4628]: E1211 06:29:02.890705 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:29:05 crc kubenswrapper[4628]: I1211 06:29:05.451794 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-gfsxf_6f9a6c48-1127-4e6c-bc88-133de5ba68e1/cert-manager-controller/0.log" Dec 11 06:29:05 crc kubenswrapper[4628]: I1211 06:29:05.644270 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-2fjb7_e08b2800-f237-4960-a939-b24a4f34b340/cert-manager-cainjector/0.log" Dec 11 06:29:05 crc kubenswrapper[4628]: I1211 06:29:05.701380 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-lf47f_2ab657c9-076c-4a39-9928-e92e8e276547/cert-manager-webhook/0.log" Dec 11 06:29:13 crc kubenswrapper[4628]: I1211 06:29:13.890201 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:29:13 crc kubenswrapper[4628]: E1211 06:29:13.891072 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:29:21 crc kubenswrapper[4628]: I1211 06:29:21.009288 4628 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-mpq2p_6b801585-cb83-40f7-ab06-68951c4455c6/nmstate-console-plugin/0.log" Dec 11 06:29:21 crc kubenswrapper[4628]: I1211 06:29:21.257545 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-tqdqm_3a77bda3-3bb4-402d-a4a9-df9e47e8ff39/kube-rbac-proxy/0.log" Dec 11 06:29:21 crc kubenswrapper[4628]: I1211 06:29:21.271444 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-lg7wb_c2471a0c-a9c4-4323-9fb6-e67872046a7d/nmstate-handler/0.log" Dec 11 06:29:21 crc kubenswrapper[4628]: I1211 06:29:21.354681 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-tqdqm_3a77bda3-3bb4-402d-a4a9-df9e47e8ff39/nmstate-metrics/0.log" Dec 11 06:29:21 crc kubenswrapper[4628]: I1211 06:29:21.502805 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-589fs_ae1d3899-1bda-4ad7-8512-9582b6fe2c54/nmstate-operator/0.log" Dec 11 06:29:21 crc kubenswrapper[4628]: I1211 06:29:21.629596 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-c54rd_ddeebf78-e410-4562-a173-563b43b1b322/nmstate-webhook/0.log" Dec 11 06:29:24 crc kubenswrapper[4628]: I1211 06:29:24.889694 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:29:24 crc kubenswrapper[4628]: E1211 06:29:24.890494 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:29:37 crc kubenswrapper[4628]: I1211 06:29:37.899729 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:29:37 crc kubenswrapper[4628]: E1211 06:29:37.904429 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:29:38 crc kubenswrapper[4628]: I1211 06:29:38.574746 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-q4v8q_7fbbee42-6c6f-4b6f-a8f4-a7acb4686612/kube-rbac-proxy/0.log" Dec 11 06:29:38 crc kubenswrapper[4628]: I1211 06:29:38.795645 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-q4v8q_7fbbee42-6c6f-4b6f-a8f4-a7acb4686612/controller/0.log" Dec 11 06:29:38 crc kubenswrapper[4628]: I1211 06:29:38.999298 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-frr-files/0.log" Dec 11 06:29:39 crc kubenswrapper[4628]: I1211 06:29:39.180265 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-reloader/0.log" Dec 11 
06:29:39 crc kubenswrapper[4628]: I1211 06:29:39.181224 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-frr-files/0.log" Dec 11 06:29:39 crc kubenswrapper[4628]: I1211 06:29:39.230814 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-reloader/0.log" Dec 11 06:29:39 crc kubenswrapper[4628]: I1211 06:29:39.255417 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-metrics/0.log" Dec 11 06:29:39 crc kubenswrapper[4628]: I1211 06:29:39.493192 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-reloader/0.log" Dec 11 06:29:39 crc kubenswrapper[4628]: I1211 06:29:39.493245 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-metrics/0.log" Dec 11 06:29:39 crc kubenswrapper[4628]: I1211 06:29:39.537106 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-frr-files/0.log" Dec 11 06:29:39 crc kubenswrapper[4628]: I1211 06:29:39.545853 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-metrics/0.log" Dec 11 06:29:39 crc kubenswrapper[4628]: I1211 06:29:39.729681 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-reloader/0.log" Dec 11 06:29:39 crc kubenswrapper[4628]: I1211 06:29:39.751000 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-frr-files/0.log" Dec 11 06:29:39 crc kubenswrapper[4628]: I1211 06:29:39.788001 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/controller/0.log" Dec 11 06:29:39 crc kubenswrapper[4628]: I1211 06:29:39.810927 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-metrics/0.log" Dec 11 06:29:40 crc kubenswrapper[4628]: I1211 06:29:40.032764 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/frr-metrics/0.log" Dec 11 06:29:40 crc kubenswrapper[4628]: I1211 06:29:40.063732 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/kube-rbac-proxy/0.log" Dec 11 06:29:42 crc kubenswrapper[4628]: I1211 06:29:42.998561 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-5n5bs_abb0cb74-4b39-47f1-9a3a-cae28b6c32f6/frr-k8s-webhook-server/0.log" Dec 11 06:29:43 crc kubenswrapper[4628]: I1211 06:29:42.998909 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/kube-rbac-proxy-frr/0.log" Dec 11 06:29:43 crc kubenswrapper[4628]: I1211 06:29:43.002199 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/reloader/0.log" Dec 11 06:29:43 crc kubenswrapper[4628]: I1211 06:29:43.246786 4628 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5f9c8b77b-f478p_8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e/manager/0.log" Dec 11 06:29:43 crc kubenswrapper[4628]: I1211 06:29:43.274670 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-68cfc95d7c-4ssjf_f91be819-4cd2-4c94-98a2-108b05ab0a23/webhook-server/0.log" Dec 11 06:29:43 crc kubenswrapper[4628]: I1211 06:29:43.584715 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v2km9_0cea36da-fd5c-416d-ab52-9500bc3fae0e/kube-rbac-proxy/0.log" Dec 11 06:29:44 crc kubenswrapper[4628]: I1211 06:29:44.179962 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v2km9_0cea36da-fd5c-416d-ab52-9500bc3fae0e/speaker/0.log" Dec 11 06:29:44 crc kubenswrapper[4628]: I1211 06:29:44.239297 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/frr/0.log" Dec 11 06:29:48 crc kubenswrapper[4628]: I1211 06:29:48.889065 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:29:48 crc kubenswrapper[4628]: E1211 06:29:48.890937 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:29:58 crc kubenswrapper[4628]: I1211 06:29:58.955926 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh_c0e6a71a-6351-4860-a562-05df960a3f2c/util/0.log" Dec 11 06:29:58 crc kubenswrapper[4628]: I1211 06:29:58.956831 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh_c0e6a71a-6351-4860-a562-05df960a3f2c/util/0.log" Dec 11 06:29:58 crc kubenswrapper[4628]: I1211 06:29:58.957735 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh_c0e6a71a-6351-4860-a562-05df960a3f2c/pull/0.log" Dec 11 06:29:58 crc kubenswrapper[4628]: I1211 06:29:58.957819 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh_c0e6a71a-6351-4860-a562-05df960a3f2c/pull/0.log" Dec 11 06:29:59 crc kubenswrapper[4628]: I1211 06:29:59.148012 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh_c0e6a71a-6351-4860-a562-05df960a3f2c/util/0.log" Dec 11 06:29:59 crc kubenswrapper[4628]: I1211 06:29:59.182943 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh_c0e6a71a-6351-4860-a562-05df960a3f2c/extract/0.log" Dec 11 06:29:59 crc kubenswrapper[4628]: I1211 06:29:59.192707 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh_c0e6a71a-6351-4860-a562-05df960a3f2c/pull/0.log" Dec 11 06:29:59 crc 
kubenswrapper[4628]: I1211 06:29:59.336344 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx_861570ba-65cf-4e91-90c0-c26b0c452c0e/util/0.log" Dec 11 06:29:59 crc kubenswrapper[4628]: I1211 06:29:59.494202 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx_861570ba-65cf-4e91-90c0-c26b0c452c0e/util/0.log" Dec 11 06:29:59 crc kubenswrapper[4628]: I1211 06:29:59.565698 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx_861570ba-65cf-4e91-90c0-c26b0c452c0e/pull/0.log" Dec 11 06:29:59 crc kubenswrapper[4628]: I1211 06:29:59.575153 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx_861570ba-65cf-4e91-90c0-c26b0c452c0e/pull/0.log" Dec 11 06:29:59 crc kubenswrapper[4628]: I1211 06:29:59.704273 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx_861570ba-65cf-4e91-90c0-c26b0c452c0e/util/0.log" Dec 11 06:29:59 crc kubenswrapper[4628]: I1211 06:29:59.779523 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx_861570ba-65cf-4e91-90c0-c26b0c452c0e/pull/0.log" Dec 11 06:29:59 crc kubenswrapper[4628]: I1211 06:29:59.792277 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx_861570ba-65cf-4e91-90c0-c26b0c452c0e/extract/0.log" Dec 11 06:29:59 crc kubenswrapper[4628]: I1211 06:29:59.919147 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wg8m2_49c38217-7f74-447b-a8ab-b7bf727d90e5/extract-utilities/0.log" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.086387 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wg8m2_49c38217-7f74-447b-a8ab-b7bf727d90e5/extract-utilities/0.log" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.160239 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8"] Dec 11 06:30:00 crc kubenswrapper[4628]: E1211 06:30:00.160745 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="355de5f2-1daa-4c8b-9073-07d5cabef696" containerName="container-00" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.160800 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="355de5f2-1daa-4c8b-9073-07d5cabef696" containerName="container-00" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.161119 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="355de5f2-1daa-4c8b-9073-07d5cabef696" containerName="container-00" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.165921 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.172560 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8"] Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.173547 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.174330 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.187631 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wg8m2_49c38217-7f74-447b-a8ab-b7bf727d90e5/extract-content/0.log" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.226469 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wg8m2_49c38217-7f74-447b-a8ab-b7bf727d90e5/extract-content/0.log" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.257404 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bt4c\" (UniqueName: \"kubernetes.io/projected/9d75a778-1531-40bf-b61b-eb6d236836ad-kube-api-access-6bt4c\") pod \"collect-profiles-29423910-wmld8\" (UID: \"9d75a778-1531-40bf-b61b-eb6d236836ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.257461 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d75a778-1531-40bf-b61b-eb6d236836ad-secret-volume\") pod \"collect-profiles-29423910-wmld8\" (UID: \"9d75a778-1531-40bf-b61b-eb6d236836ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.257578 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d75a778-1531-40bf-b61b-eb6d236836ad-config-volume\") pod \"collect-profiles-29423910-wmld8\" (UID: \"9d75a778-1531-40bf-b61b-eb6d236836ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.359739 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bt4c\" (UniqueName: \"kubernetes.io/projected/9d75a778-1531-40bf-b61b-eb6d236836ad-kube-api-access-6bt4c\") pod \"collect-profiles-29423910-wmld8\" (UID: \"9d75a778-1531-40bf-b61b-eb6d236836ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.359818 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d75a778-1531-40bf-b61b-eb6d236836ad-secret-volume\") pod \"collect-profiles-29423910-wmld8\" (UID: \"9d75a778-1531-40bf-b61b-eb6d236836ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.359868 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/9d75a778-1531-40bf-b61b-eb6d236836ad-config-volume\") pod \"collect-profiles-29423910-wmld8\" (UID: \"9d75a778-1531-40bf-b61b-eb6d236836ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.360783 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d75a778-1531-40bf-b61b-eb6d236836ad-config-volume\") pod \"collect-profiles-29423910-wmld8\" (UID: \"9d75a778-1531-40bf-b61b-eb6d236836ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.365241 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d75a778-1531-40bf-b61b-eb6d236836ad-secret-volume\") pod \"collect-profiles-29423910-wmld8\" (UID: \"9d75a778-1531-40bf-b61b-eb6d236836ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.378857 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bt4c\" (UniqueName: \"kubernetes.io/projected/9d75a778-1531-40bf-b61b-eb6d236836ad-kube-api-access-6bt4c\") pod \"collect-profiles-29423910-wmld8\" (UID: \"9d75a778-1531-40bf-b61b-eb6d236836ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.478129 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wg8m2_49c38217-7f74-447b-a8ab-b7bf727d90e5/extract-content/0.log" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.497916 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.508372 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wg8m2_49c38217-7f74-447b-a8ab-b7bf727d90e5/extract-utilities/0.log" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.690532 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wg8m2_49c38217-7f74-447b-a8ab-b7bf727d90e5/registry-server/0.log" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.819378 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrfw_e99689f2-e449-4a63-aee4-2c22e629616a/extract-utilities/0.log" Dec 11 06:30:00 crc kubenswrapper[4628]: I1211 06:30:00.961413 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8"] Dec 11 06:30:01 crc kubenswrapper[4628]: I1211 06:30:01.030463 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrfw_e99689f2-e449-4a63-aee4-2c22e629616a/extract-utilities/0.log" Dec 11 06:30:01 crc kubenswrapper[4628]: I1211 06:30:01.125737 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrfw_e99689f2-e449-4a63-aee4-2c22e629616a/extract-content/0.log" Dec 11 06:30:01 crc kubenswrapper[4628]: I1211 06:30:01.150944 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrfw_e99689f2-e449-4a63-aee4-2c22e629616a/extract-content/0.log" Dec 11 06:30:01 crc kubenswrapper[4628]: I1211 06:30:01.379419 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrfw_e99689f2-e449-4a63-aee4-2c22e629616a/extract-content/0.log" Dec 11 06:30:01 crc kubenswrapper[4628]: I1211 06:30:01.482954 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8" event={"ID":"9d75a778-1531-40bf-b61b-eb6d236836ad","Type":"ContainerStarted","Data":"6379be4ef6805d594485353ace43e7a59acaa118dd1b69de0161b753a50278f7"} Dec 11 06:30:01 crc kubenswrapper[4628]: I1211 06:30:01.483007 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8" event={"ID":"9d75a778-1531-40bf-b61b-eb6d236836ad","Type":"ContainerStarted","Data":"823317ce6da24b259e9f1a1e004b3e1d2479ac993427a9525faefeb009b3425e"} Dec 11 06:30:01 crc kubenswrapper[4628]: I1211 06:30:01.510294 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8" podStartSLOduration=1.5102587349999999 podStartE2EDuration="1.510258735s" podCreationTimestamp="2025-12-11 06:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 06:30:01.501577634 +0000 UTC m=+4503.918924352" watchObservedRunningTime="2025-12-11 06:30:01.510258735 +0000 UTC m=+4503.927605433" Dec 11 06:30:01 crc kubenswrapper[4628]: I1211 06:30:01.545639 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrfw_e99689f2-e449-4a63-aee4-2c22e629616a/extract-utilities/0.log" Dec 11 06:30:01 crc kubenswrapper[4628]: I1211 06:30:01.768566 4628 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8wvwr_7683eae0-a7bd-46c4-867e-b15d65fc5e7e/marketplace-operator/0.log" Dec 11 06:30:01 crc kubenswrapper[4628]: I1211 06:30:01.879395 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8bd7x_a0ff0009-81bb-47da-aab8-5caeeec49061/extract-utilities/0.log" Dec 11 06:30:01 crc kubenswrapper[4628]: I1211 06:30:01.951284 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrfw_e99689f2-e449-4a63-aee4-2c22e629616a/registry-server/0.log" Dec 11 06:30:02 crc kubenswrapper[4628]: I1211 06:30:02.143148 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8bd7x_a0ff0009-81bb-47da-aab8-5caeeec49061/extract-utilities/0.log" Dec 11 06:30:02 crc kubenswrapper[4628]: I1211 06:30:02.152022 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8bd7x_a0ff0009-81bb-47da-aab8-5caeeec49061/extract-content/0.log" Dec 11 06:30:02 crc kubenswrapper[4628]: I1211 06:30:02.201361 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8bd7x_a0ff0009-81bb-47da-aab8-5caeeec49061/extract-content/0.log" Dec 11 06:30:02 crc kubenswrapper[4628]: I1211 06:30:02.333956 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8bd7x_a0ff0009-81bb-47da-aab8-5caeeec49061/extract-utilities/0.log" Dec 11 06:30:02 crc kubenswrapper[4628]: I1211 06:30:02.415440 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8bd7x_a0ff0009-81bb-47da-aab8-5caeeec49061/extract-content/0.log" Dec 11 06:30:02 crc kubenswrapper[4628]: I1211 06:30:02.477074 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v9ml8_1faf62cf-c5ee-426d-afb5-25a16930ddbd/extract-utilities/0.log" Dec 11 06:30:02 crc kubenswrapper[4628]: I1211 06:30:02.499585 4628 generic.go:334] "Generic (PLEG): container finished" podID="9d75a778-1531-40bf-b61b-eb6d236836ad" containerID="6379be4ef6805d594485353ace43e7a59acaa118dd1b69de0161b753a50278f7" exitCode=0 Dec 11 06:30:02 crc kubenswrapper[4628]: I1211 06:30:02.499632 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8" event={"ID":"9d75a778-1531-40bf-b61b-eb6d236836ad","Type":"ContainerDied","Data":"6379be4ef6805d594485353ace43e7a59acaa118dd1b69de0161b753a50278f7"} Dec 11 06:30:02 crc kubenswrapper[4628]: I1211 06:30:02.571455 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8bd7x_a0ff0009-81bb-47da-aab8-5caeeec49061/registry-server/0.log" Dec 11 06:30:02 crc kubenswrapper[4628]: I1211 06:30:02.659797 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v9ml8_1faf62cf-c5ee-426d-afb5-25a16930ddbd/extract-content/0.log" Dec 11 06:30:02 crc kubenswrapper[4628]: I1211 06:30:02.670973 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v9ml8_1faf62cf-c5ee-426d-afb5-25a16930ddbd/extract-utilities/0.log" Dec 11 06:30:02 crc kubenswrapper[4628]: I1211 06:30:02.698233 4628 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-v9ml8_1faf62cf-c5ee-426d-afb5-25a16930ddbd/extract-content/0.log" Dec 11 06:30:02 crc kubenswrapper[4628]: I1211 06:30:02.889672 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:30:02 crc kubenswrapper[4628]: E1211 06:30:02.890224 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:30:02 crc kubenswrapper[4628]: I1211 06:30:02.894470 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v9ml8_1faf62cf-c5ee-426d-afb5-25a16930ddbd/extract-content/0.log" Dec 11 06:30:02 crc kubenswrapper[4628]: I1211 06:30:02.914160 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v9ml8_1faf62cf-c5ee-426d-afb5-25a16930ddbd/extract-utilities/0.log" Dec 11 06:30:03 crc kubenswrapper[4628]: I1211 06:30:03.498433 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v9ml8_1faf62cf-c5ee-426d-afb5-25a16930ddbd/registry-server/0.log" Dec 11 06:30:03 crc kubenswrapper[4628]: I1211 06:30:03.861259 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8" Dec 11 06:30:03 crc kubenswrapper[4628]: I1211 06:30:03.933583 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d75a778-1531-40bf-b61b-eb6d236836ad-secret-volume\") pod \"9d75a778-1531-40bf-b61b-eb6d236836ad\" (UID: \"9d75a778-1531-40bf-b61b-eb6d236836ad\") " Dec 11 06:30:03 crc kubenswrapper[4628]: I1211 06:30:03.933651 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bt4c\" (UniqueName: \"kubernetes.io/projected/9d75a778-1531-40bf-b61b-eb6d236836ad-kube-api-access-6bt4c\") pod \"9d75a778-1531-40bf-b61b-eb6d236836ad\" (UID: \"9d75a778-1531-40bf-b61b-eb6d236836ad\") " Dec 11 06:30:03 crc kubenswrapper[4628]: I1211 06:30:03.933694 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d75a778-1531-40bf-b61b-eb6d236836ad-config-volume\") pod \"9d75a778-1531-40bf-b61b-eb6d236836ad\" (UID: \"9d75a778-1531-40bf-b61b-eb6d236836ad\") " Dec 11 06:30:03 crc kubenswrapper[4628]: I1211 06:30:03.936519 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d75a778-1531-40bf-b61b-eb6d236836ad-config-volume" (OuterVolumeSpecName: "config-volume") pod "9d75a778-1531-40bf-b61b-eb6d236836ad" (UID: "9d75a778-1531-40bf-b61b-eb6d236836ad"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 06:30:03 crc kubenswrapper[4628]: I1211 06:30:03.942318 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d75a778-1531-40bf-b61b-eb6d236836ad-kube-api-access-6bt4c" (OuterVolumeSpecName: "kube-api-access-6bt4c") pod "9d75a778-1531-40bf-b61b-eb6d236836ad" (UID: "9d75a778-1531-40bf-b61b-eb6d236836ad"). InnerVolumeSpecName "kube-api-access-6bt4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:30:03 crc kubenswrapper[4628]: I1211 06:30:03.950026 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d75a778-1531-40bf-b61b-eb6d236836ad-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9d75a778-1531-40bf-b61b-eb6d236836ad" (UID: "9d75a778-1531-40bf-b61b-eb6d236836ad"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 06:30:04 crc kubenswrapper[4628]: I1211 06:30:04.035882 4628 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d75a778-1531-40bf-b61b-eb6d236836ad-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 11 06:30:04 crc kubenswrapper[4628]: I1211 06:30:04.035914 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bt4c\" (UniqueName: \"kubernetes.io/projected/9d75a778-1531-40bf-b61b-eb6d236836ad-kube-api-access-6bt4c\") on node \"crc\" DevicePath \"\"" Dec 11 06:30:04 crc kubenswrapper[4628]: I1211 06:30:04.035923 4628 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d75a778-1531-40bf-b61b-eb6d236836ad-config-volume\") on node \"crc\" DevicePath \"\"" Dec 11 06:30:04 crc kubenswrapper[4628]: I1211 06:30:04.519386 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8" event={"ID":"9d75a778-1531-40bf-b61b-eb6d236836ad","Type":"ContainerDied","Data":"823317ce6da24b259e9f1a1e004b3e1d2479ac993427a9525faefeb009b3425e"} Dec 11 06:30:04 crc kubenswrapper[4628]: I1211 06:30:04.519424 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="823317ce6da24b259e9f1a1e004b3e1d2479ac993427a9525faefeb009b3425e" Dec 11 06:30:04 crc kubenswrapper[4628]: I1211 06:30:04.519473 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423910-wmld8" Dec 11 06:30:04 crc kubenswrapper[4628]: I1211 06:30:04.590280 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv"] Dec 11 06:30:04 crc kubenswrapper[4628]: I1211 06:30:04.603319 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423865-6h8xv"] Dec 11 06:30:05 crc kubenswrapper[4628]: I1211 06:30:05.912649 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a70a7e6-67fc-41c4-ab90-96aa3f3411f0" path="/var/lib/kubelet/pods/2a70a7e6-67fc-41c4-ab90-96aa3f3411f0/volumes" Dec 11 06:30:13 crc kubenswrapper[4628]: I1211 06:30:13.890352 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:30:13 crc kubenswrapper[4628]: E1211 06:30:13.891299 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:30:24 crc kubenswrapper[4628]: I1211 06:30:24.889474 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:30:24 crc kubenswrapper[4628]: E1211 06:30:24.890264 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:30:29 crc kubenswrapper[4628]: I1211 06:30:29.371546 4628 scope.go:117] "RemoveContainer" containerID="0663e4609851b140d73b7f7047af6f64934ff3a538afe9374f31e273207cb664" Dec 11 06:30:36 crc kubenswrapper[4628]: I1211 06:30:36.889939 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:30:36 crc kubenswrapper[4628]: E1211 06:30:36.890818 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:30:50 crc kubenswrapper[4628]: I1211 06:30:50.889991 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:30:50 crc kubenswrapper[4628]: E1211 06:30:50.890744 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:31:02 crc kubenswrapper[4628]: I1211 06:31:02.889077 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:31:02 crc kubenswrapper[4628]: E1211 06:31:02.890081 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:31:13 crc kubenswrapper[4628]: I1211 06:31:13.893314 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:31:13 crc kubenswrapper[4628]: E1211 06:31:13.894043 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:31:25 crc kubenswrapper[4628]: I1211 06:31:25.889493 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:31:25 crc kubenswrapper[4628]: E1211 06:31:25.890434 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:31:35 crc kubenswrapper[4628]: I1211 06:31:35.449544 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zn266"] Dec 11 06:31:35 crc kubenswrapper[4628]: E1211 06:31:35.450608 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d75a778-1531-40bf-b61b-eb6d236836ad" containerName="collect-profiles" Dec 11 06:31:35 crc kubenswrapper[4628]: I1211 06:31:35.450626 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d75a778-1531-40bf-b61b-eb6d236836ad" containerName="collect-profiles" Dec 11 06:31:35 crc kubenswrapper[4628]: I1211 06:31:35.450947 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d75a778-1531-40bf-b61b-eb6d236836ad" containerName="collect-profiles" Dec 11 06:31:35 crc kubenswrapper[4628]: I1211 06:31:35.452979 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zn266" Dec 11 06:31:35 crc kubenswrapper[4628]: I1211 06:31:35.491091 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zn266"] Dec 11 06:31:35 crc kubenswrapper[4628]: I1211 06:31:35.582995 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b2e271a-810c-454e-ac0f-41f9fd198348-utilities\") pod \"redhat-marketplace-zn266\" (UID: \"4b2e271a-810c-454e-ac0f-41f9fd198348\") " pod="openshift-marketplace/redhat-marketplace-zn266" Dec 11 06:31:35 crc kubenswrapper[4628]: I1211 06:31:35.583075 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b2e271a-810c-454e-ac0f-41f9fd198348-catalog-content\") pod \"redhat-marketplace-zn266\" (UID: \"4b2e271a-810c-454e-ac0f-41f9fd198348\") " pod="openshift-marketplace/redhat-marketplace-zn266" Dec 11 06:31:35 crc kubenswrapper[4628]: I1211 06:31:35.583129 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh2mv\" (UniqueName: \"kubernetes.io/projected/4b2e271a-810c-454e-ac0f-41f9fd198348-kube-api-access-zh2mv\") pod \"redhat-marketplace-zn266\" (UID: \"4b2e271a-810c-454e-ac0f-41f9fd198348\") " pod="openshift-marketplace/redhat-marketplace-zn266" Dec 11 06:31:35 crc kubenswrapper[4628]: I1211 06:31:35.685234 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b2e271a-810c-454e-ac0f-41f9fd198348-catalog-content\") pod \"redhat-marketplace-zn266\" (UID: \"4b2e271a-810c-454e-ac0f-41f9fd198348\") " pod="openshift-marketplace/redhat-marketplace-zn266" Dec 11 06:31:35 crc kubenswrapper[4628]: I1211 06:31:35.685317 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh2mv\" (UniqueName: \"kubernetes.io/projected/4b2e271a-810c-454e-ac0f-41f9fd198348-kube-api-access-zh2mv\") pod \"redhat-marketplace-zn266\" (UID: \"4b2e271a-810c-454e-ac0f-41f9fd198348\") " pod="openshift-marketplace/redhat-marketplace-zn266" Dec 11 06:31:35 crc kubenswrapper[4628]: I1211 06:31:35.685454 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b2e271a-810c-454e-ac0f-41f9fd198348-utilities\") pod \"redhat-marketplace-zn266\" (UID: \"4b2e271a-810c-454e-ac0f-41f9fd198348\") " pod="openshift-marketplace/redhat-marketplace-zn266" Dec 11 06:31:35 crc kubenswrapper[4628]: I1211 06:31:35.685963 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b2e271a-810c-454e-ac0f-41f9fd198348-utilities\") pod \"redhat-marketplace-zn266\" (UID: \"4b2e271a-810c-454e-ac0f-41f9fd198348\") " pod="openshift-marketplace/redhat-marketplace-zn266" Dec 11 06:31:35 crc kubenswrapper[4628]: I1211 06:31:35.686040 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b2e271a-810c-454e-ac0f-41f9fd198348-catalog-content\") pod \"redhat-marketplace-zn266\" (UID: \"4b2e271a-810c-454e-ac0f-41f9fd198348\") " pod="openshift-marketplace/redhat-marketplace-zn266" Dec 11 06:31:35 crc kubenswrapper[4628]: I1211 06:31:35.711941 4628 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zh2mv\" (UniqueName: \"kubernetes.io/projected/4b2e271a-810c-454e-ac0f-41f9fd198348-kube-api-access-zh2mv\") pod \"redhat-marketplace-zn266\" (UID: \"4b2e271a-810c-454e-ac0f-41f9fd198348\") " pod="openshift-marketplace/redhat-marketplace-zn266" Dec 11 06:31:35 crc kubenswrapper[4628]: I1211 06:31:35.790920 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zn266" Dec 11 06:31:36 crc kubenswrapper[4628]: I1211 06:31:36.339900 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zn266"] Dec 11 06:31:36 crc kubenswrapper[4628]: I1211 06:31:36.447779 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zn266" event={"ID":"4b2e271a-810c-454e-ac0f-41f9fd198348","Type":"ContainerStarted","Data":"fd0554b414f546a84334b0ee25a82da2e47972c9fc30961243f635d1d44231b5"} Dec 11 06:31:37 crc kubenswrapper[4628]: I1211 06:31:37.466422 4628 generic.go:334] "Generic (PLEG): container finished" podID="4b2e271a-810c-454e-ac0f-41f9fd198348" containerID="b41971b1c0d792a14c70b2281aa3cde93b79600f25c3203c2f6c3d99817c99dc" exitCode=0 Dec 11 06:31:37 crc kubenswrapper[4628]: I1211 06:31:37.466535 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zn266" event={"ID":"4b2e271a-810c-454e-ac0f-41f9fd198348","Type":"ContainerDied","Data":"b41971b1c0d792a14c70b2281aa3cde93b79600f25c3203c2f6c3d99817c99dc"} Dec 11 06:31:37 crc kubenswrapper[4628]: I1211 06:31:37.469380 4628 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 06:31:38 crc kubenswrapper[4628]: I1211 06:31:38.477768 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zn266" event={"ID":"4b2e271a-810c-454e-ac0f-41f9fd198348","Type":"ContainerStarted","Data":"a92f1427d884780f6f492c76c043414eac48399708b1c541177af1ca6a376e7e"} Dec 11 06:31:39 crc kubenswrapper[4628]: I1211 06:31:39.491837 4628 generic.go:334] "Generic (PLEG): container finished" podID="4b2e271a-810c-454e-ac0f-41f9fd198348" containerID="a92f1427d884780f6f492c76c043414eac48399708b1c541177af1ca6a376e7e" exitCode=0 Dec 11 06:31:39 crc kubenswrapper[4628]: I1211 06:31:39.491942 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zn266" event={"ID":"4b2e271a-810c-454e-ac0f-41f9fd198348","Type":"ContainerDied","Data":"a92f1427d884780f6f492c76c043414eac48399708b1c541177af1ca6a376e7e"} Dec 11 06:31:40 crc kubenswrapper[4628]: I1211 06:31:40.506900 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zn266" event={"ID":"4b2e271a-810c-454e-ac0f-41f9fd198348","Type":"ContainerStarted","Data":"7f313b86ae33f5e3d1e6e4690d5a8491519aa4d9306e8fc6581bd592a9424e66"} Dec 11 06:31:40 crc kubenswrapper[4628]: I1211 06:31:40.534865 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zn266" podStartSLOduration=3.025479086 podStartE2EDuration="5.534808851s" podCreationTimestamp="2025-12-11 06:31:35 +0000 UTC" firstStartedPulling="2025-12-11 06:31:37.468786842 +0000 UTC m=+4599.886133580" lastFinishedPulling="2025-12-11 06:31:39.978116647 +0000 UTC m=+4602.395463345" observedRunningTime="2025-12-11 06:31:40.524911638 +0000 UTC m=+4602.942258336" watchObservedRunningTime="2025-12-11 06:31:40.534808851 +0000 UTC 
m=+4602.952155549" Dec 11 06:31:40 crc kubenswrapper[4628]: I1211 06:31:40.889383 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:31:42 crc kubenswrapper[4628]: I1211 06:31:42.529452 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"88388dbd3453cd64d96e82a99d6a17e7eeefa79420b0d46578dd03a105614740"} Dec 11 06:31:45 crc kubenswrapper[4628]: I1211 06:31:45.791477 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zn266" Dec 11 06:31:45 crc kubenswrapper[4628]: I1211 06:31:45.792202 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zn266" Dec 11 06:31:45 crc kubenswrapper[4628]: I1211 06:31:45.836277 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zn266" Dec 11 06:31:46 crc kubenswrapper[4628]: I1211 06:31:46.652814 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zn266" Dec 11 06:31:46 crc kubenswrapper[4628]: I1211 06:31:46.717372 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zn266"] Dec 11 06:31:48 crc kubenswrapper[4628]: I1211 06:31:48.588758 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zn266" podUID="4b2e271a-810c-454e-ac0f-41f9fd198348" containerName="registry-server" containerID="cri-o://7f313b86ae33f5e3d1e6e4690d5a8491519aa4d9306e8fc6581bd592a9424e66" gracePeriod=2 Dec 11 06:31:49 crc kubenswrapper[4628]: I1211 06:31:49.627677 4628 generic.go:334] "Generic (PLEG): container finished" podID="4b2e271a-810c-454e-ac0f-41f9fd198348" containerID="7f313b86ae33f5e3d1e6e4690d5a8491519aa4d9306e8fc6581bd592a9424e66" exitCode=0 Dec 11 06:31:49 crc kubenswrapper[4628]: I1211 06:31:49.627723 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zn266" event={"ID":"4b2e271a-810c-454e-ac0f-41f9fd198348","Type":"ContainerDied","Data":"7f313b86ae33f5e3d1e6e4690d5a8491519aa4d9306e8fc6581bd592a9424e66"} Dec 11 06:31:49 crc kubenswrapper[4628]: I1211 06:31:49.627800 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zn266" event={"ID":"4b2e271a-810c-454e-ac0f-41f9fd198348","Type":"ContainerDied","Data":"fd0554b414f546a84334b0ee25a82da2e47972c9fc30961243f635d1d44231b5"} Dec 11 06:31:49 crc kubenswrapper[4628]: I1211 06:31:49.627819 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd0554b414f546a84334b0ee25a82da2e47972c9fc30961243f635d1d44231b5" Dec 11 06:31:49 crc kubenswrapper[4628]: I1211 06:31:49.637530 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zn266" Dec 11 06:31:49 crc kubenswrapper[4628]: I1211 06:31:49.714451 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b2e271a-810c-454e-ac0f-41f9fd198348-catalog-content\") pod \"4b2e271a-810c-454e-ac0f-41f9fd198348\" (UID: \"4b2e271a-810c-454e-ac0f-41f9fd198348\") " Dec 11 06:31:49 crc kubenswrapper[4628]: I1211 06:31:49.714832 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b2e271a-810c-454e-ac0f-41f9fd198348-utilities\") pod \"4b2e271a-810c-454e-ac0f-41f9fd198348\" (UID: \"4b2e271a-810c-454e-ac0f-41f9fd198348\") " Dec 11 06:31:49 crc kubenswrapper[4628]: I1211 06:31:49.715124 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh2mv\" (UniqueName: \"kubernetes.io/projected/4b2e271a-810c-454e-ac0f-41f9fd198348-kube-api-access-zh2mv\") pod \"4b2e271a-810c-454e-ac0f-41f9fd198348\" (UID: \"4b2e271a-810c-454e-ac0f-41f9fd198348\") " Dec 11 06:31:49 crc kubenswrapper[4628]: I1211 06:31:49.715718 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b2e271a-810c-454e-ac0f-41f9fd198348-utilities" (OuterVolumeSpecName: "utilities") pod "4b2e271a-810c-454e-ac0f-41f9fd198348" (UID: "4b2e271a-810c-454e-ac0f-41f9fd198348"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:31:49 crc kubenswrapper[4628]: I1211 06:31:49.734150 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b2e271a-810c-454e-ac0f-41f9fd198348-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b2e271a-810c-454e-ac0f-41f9fd198348" (UID: "4b2e271a-810c-454e-ac0f-41f9fd198348"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:31:49 crc kubenswrapper[4628]: I1211 06:31:49.817576 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b2e271a-810c-454e-ac0f-41f9fd198348-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 06:31:49 crc kubenswrapper[4628]: I1211 06:31:49.817610 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b2e271a-810c-454e-ac0f-41f9fd198348-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 06:31:50 crc kubenswrapper[4628]: I1211 06:31:50.293190 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b2e271a-810c-454e-ac0f-41f9fd198348-kube-api-access-zh2mv" (OuterVolumeSpecName: "kube-api-access-zh2mv") pod "4b2e271a-810c-454e-ac0f-41f9fd198348" (UID: "4b2e271a-810c-454e-ac0f-41f9fd198348"). InnerVolumeSpecName "kube-api-access-zh2mv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:31:50 crc kubenswrapper[4628]: I1211 06:31:50.329263 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh2mv\" (UniqueName: \"kubernetes.io/projected/4b2e271a-810c-454e-ac0f-41f9fd198348-kube-api-access-zh2mv\") on node \"crc\" DevicePath \"\"" Dec 11 06:31:50 crc kubenswrapper[4628]: I1211 06:31:50.650095 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zn266" Dec 11 06:31:50 crc kubenswrapper[4628]: I1211 06:31:50.700400 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zn266"] Dec 11 06:31:50 crc kubenswrapper[4628]: I1211 06:31:50.708581 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zn266"] Dec 11 06:31:51 crc kubenswrapper[4628]: I1211 06:31:51.909192 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b2e271a-810c-454e-ac0f-41f9fd198348" path="/var/lib/kubelet/pods/4b2e271a-810c-454e-ac0f-41f9fd198348/volumes" Dec 11 06:32:05 crc kubenswrapper[4628]: I1211 06:32:05.246692 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lvdq7"] Dec 11 06:32:05 crc kubenswrapper[4628]: E1211 06:32:05.248059 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b2e271a-810c-454e-ac0f-41f9fd198348" containerName="extract-content" Dec 11 06:32:05 crc kubenswrapper[4628]: I1211 06:32:05.248083 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2e271a-810c-454e-ac0f-41f9fd198348" containerName="extract-content" Dec 11 06:32:05 crc kubenswrapper[4628]: E1211 06:32:05.248124 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b2e271a-810c-454e-ac0f-41f9fd198348" containerName="extract-utilities" Dec 11 06:32:05 crc kubenswrapper[4628]: I1211 06:32:05.248135 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2e271a-810c-454e-ac0f-41f9fd198348" containerName="extract-utilities" Dec 11 06:32:05 crc kubenswrapper[4628]: E1211 06:32:05.248169 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b2e271a-810c-454e-ac0f-41f9fd198348" containerName="registry-server" Dec 11 06:32:05 crc kubenswrapper[4628]: I1211 06:32:05.248182 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2e271a-810c-454e-ac0f-41f9fd198348" containerName="registry-server" Dec 11 06:32:05 crc kubenswrapper[4628]: I1211 06:32:05.248473 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b2e271a-810c-454e-ac0f-41f9fd198348" containerName="registry-server" Dec 11 06:32:05 crc kubenswrapper[4628]: I1211 06:32:05.250333 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lvdq7" Dec 11 06:32:05 crc kubenswrapper[4628]: I1211 06:32:05.277201 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lvdq7"] Dec 11 06:32:05 crc kubenswrapper[4628]: I1211 06:32:05.409565 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5707350-780e-4092-9454-26ffa111288b-catalog-content\") pod \"community-operators-lvdq7\" (UID: \"f5707350-780e-4092-9454-26ffa111288b\") " pod="openshift-marketplace/community-operators-lvdq7" Dec 11 06:32:05 crc kubenswrapper[4628]: I1211 06:32:05.409730 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5707350-780e-4092-9454-26ffa111288b-utilities\") pod \"community-operators-lvdq7\" (UID: \"f5707350-780e-4092-9454-26ffa111288b\") " pod="openshift-marketplace/community-operators-lvdq7" Dec 11 06:32:05 crc kubenswrapper[4628]: I1211 06:32:05.414924 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt6ft\" (UniqueName: \"kubernetes.io/projected/f5707350-780e-4092-9454-26ffa111288b-kube-api-access-gt6ft\") pod \"community-operators-lvdq7\" (UID: \"f5707350-780e-4092-9454-26ffa111288b\") " pod="openshift-marketplace/community-operators-lvdq7" Dec 11 06:32:05 crc kubenswrapper[4628]: I1211 06:32:05.517359 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt6ft\" (UniqueName: \"kubernetes.io/projected/f5707350-780e-4092-9454-26ffa111288b-kube-api-access-gt6ft\") pod \"community-operators-lvdq7\" (UID: \"f5707350-780e-4092-9454-26ffa111288b\") " pod="openshift-marketplace/community-operators-lvdq7" Dec 11 06:32:05 crc kubenswrapper[4628]: I1211 06:32:05.517479 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5707350-780e-4092-9454-26ffa111288b-catalog-content\") pod \"community-operators-lvdq7\" (UID: \"f5707350-780e-4092-9454-26ffa111288b\") " pod="openshift-marketplace/community-operators-lvdq7" Dec 11 06:32:05 crc kubenswrapper[4628]: I1211 06:32:05.517526 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5707350-780e-4092-9454-26ffa111288b-utilities\") pod \"community-operators-lvdq7\" (UID: \"f5707350-780e-4092-9454-26ffa111288b\") " pod="openshift-marketplace/community-operators-lvdq7" Dec 11 06:32:05 crc kubenswrapper[4628]: I1211 06:32:05.518345 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5707350-780e-4092-9454-26ffa111288b-utilities\") pod \"community-operators-lvdq7\" (UID: \"f5707350-780e-4092-9454-26ffa111288b\") " pod="openshift-marketplace/community-operators-lvdq7" Dec 11 06:32:05 crc kubenswrapper[4628]: I1211 06:32:05.518560 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5707350-780e-4092-9454-26ffa111288b-catalog-content\") pod \"community-operators-lvdq7\" (UID: \"f5707350-780e-4092-9454-26ffa111288b\") " pod="openshift-marketplace/community-operators-lvdq7" Dec 11 06:32:05 crc kubenswrapper[4628]: I1211 06:32:05.538965 4628 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gt6ft\" (UniqueName: \"kubernetes.io/projected/f5707350-780e-4092-9454-26ffa111288b-kube-api-access-gt6ft\") pod \"community-operators-lvdq7\" (UID: \"f5707350-780e-4092-9454-26ffa111288b\") " pod="openshift-marketplace/community-operators-lvdq7" Dec 11 06:32:05 crc kubenswrapper[4628]: I1211 06:32:05.584142 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lvdq7" Dec 11 06:32:06 crc kubenswrapper[4628]: I1211 06:32:06.173561 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lvdq7"] Dec 11 06:32:06 crc kubenswrapper[4628]: I1211 06:32:06.790758 4628 generic.go:334] "Generic (PLEG): container finished" podID="f5707350-780e-4092-9454-26ffa111288b" containerID="5d42b475a6b634b0d445a97c3db7dd3b875c6d7f6cfe12cd49f5c1dc5ea3b959" exitCode=0 Dec 11 06:32:06 crc kubenswrapper[4628]: I1211 06:32:06.790828 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvdq7" event={"ID":"f5707350-780e-4092-9454-26ffa111288b","Type":"ContainerDied","Data":"5d42b475a6b634b0d445a97c3db7dd3b875c6d7f6cfe12cd49f5c1dc5ea3b959"} Dec 11 06:32:06 crc kubenswrapper[4628]: I1211 06:32:06.791066 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvdq7" event={"ID":"f5707350-780e-4092-9454-26ffa111288b","Type":"ContainerStarted","Data":"0449e8f1b0daa58dc305961caf0458a58755cc2e08e4b49237be6b63290f756d"} Dec 11 06:32:07 crc kubenswrapper[4628]: I1211 06:32:07.800853 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvdq7" event={"ID":"f5707350-780e-4092-9454-26ffa111288b","Type":"ContainerStarted","Data":"7cace17fcadc7c5087a5bf4ef13f7946ba13212fff32b6588002ea80fc375c15"} Dec 11 06:32:08 crc kubenswrapper[4628]: I1211 06:32:08.813750 4628 generic.go:334] "Generic (PLEG): container finished" podID="f5707350-780e-4092-9454-26ffa111288b" containerID="7cace17fcadc7c5087a5bf4ef13f7946ba13212fff32b6588002ea80fc375c15" exitCode=0 Dec 11 06:32:08 crc kubenswrapper[4628]: I1211 06:32:08.813817 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvdq7" event={"ID":"f5707350-780e-4092-9454-26ffa111288b","Type":"ContainerDied","Data":"7cace17fcadc7c5087a5bf4ef13f7946ba13212fff32b6588002ea80fc375c15"} Dec 11 06:32:10 crc kubenswrapper[4628]: I1211 06:32:10.841403 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvdq7" event={"ID":"f5707350-780e-4092-9454-26ffa111288b","Type":"ContainerStarted","Data":"05dfa7fcd51dfa354bf915fe04354419836ab190e66e3d35344373b61863f16b"} Dec 11 06:32:10 crc kubenswrapper[4628]: I1211 06:32:10.876126 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lvdq7" podStartSLOduration=2.801749689 podStartE2EDuration="5.876102328s" podCreationTimestamp="2025-12-11 06:32:05 +0000 UTC" firstStartedPulling="2025-12-11 06:32:06.79414711 +0000 UTC m=+4629.211493808" lastFinishedPulling="2025-12-11 06:32:09.868499749 +0000 UTC m=+4632.285846447" observedRunningTime="2025-12-11 06:32:10.865723462 +0000 UTC m=+4633.283070160" watchObservedRunningTime="2025-12-11 06:32:10.876102328 +0000 UTC m=+4633.293449026" Dec 11 06:32:15 crc kubenswrapper[4628]: I1211 06:32:15.584504 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-lvdq7" Dec 11 06:32:15 crc kubenswrapper[4628]: I1211 06:32:15.585111 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lvdq7" Dec 11 06:32:15 crc kubenswrapper[4628]: I1211 06:32:15.757025 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lvdq7" Dec 11 06:32:15 crc kubenswrapper[4628]: I1211 06:32:15.979586 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lvdq7" Dec 11 06:32:16 crc kubenswrapper[4628]: I1211 06:32:16.033738 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lvdq7"] Dec 11 06:32:17 crc kubenswrapper[4628]: I1211 06:32:17.915765 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lvdq7" podUID="f5707350-780e-4092-9454-26ffa111288b" containerName="registry-server" containerID="cri-o://05dfa7fcd51dfa354bf915fe04354419836ab190e66e3d35344373b61863f16b" gracePeriod=2 Dec 11 06:32:18 crc kubenswrapper[4628]: I1211 06:32:18.380374 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lvdq7" Dec 11 06:32:18 crc kubenswrapper[4628]: I1211 06:32:18.570352 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5707350-780e-4092-9454-26ffa111288b-catalog-content\") pod \"f5707350-780e-4092-9454-26ffa111288b\" (UID: \"f5707350-780e-4092-9454-26ffa111288b\") " Dec 11 06:32:18 crc kubenswrapper[4628]: I1211 06:32:18.570427 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5707350-780e-4092-9454-26ffa111288b-utilities\") pod \"f5707350-780e-4092-9454-26ffa111288b\" (UID: \"f5707350-780e-4092-9454-26ffa111288b\") " Dec 11 06:32:18 crc kubenswrapper[4628]: I1211 06:32:18.570611 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt6ft\" (UniqueName: \"kubernetes.io/projected/f5707350-780e-4092-9454-26ffa111288b-kube-api-access-gt6ft\") pod \"f5707350-780e-4092-9454-26ffa111288b\" (UID: \"f5707350-780e-4092-9454-26ffa111288b\") " Dec 11 06:32:18 crc kubenswrapper[4628]: I1211 06:32:18.571968 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5707350-780e-4092-9454-26ffa111288b-utilities" (OuterVolumeSpecName: "utilities") pod "f5707350-780e-4092-9454-26ffa111288b" (UID: "f5707350-780e-4092-9454-26ffa111288b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:32:18 crc kubenswrapper[4628]: I1211 06:32:18.585126 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5707350-780e-4092-9454-26ffa111288b-kube-api-access-gt6ft" (OuterVolumeSpecName: "kube-api-access-gt6ft") pod "f5707350-780e-4092-9454-26ffa111288b" (UID: "f5707350-780e-4092-9454-26ffa111288b"). InnerVolumeSpecName "kube-api-access-gt6ft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:32:18 crc kubenswrapper[4628]: I1211 06:32:18.616725 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5707350-780e-4092-9454-26ffa111288b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5707350-780e-4092-9454-26ffa111288b" (UID: "f5707350-780e-4092-9454-26ffa111288b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:32:18 crc kubenswrapper[4628]: I1211 06:32:18.672487 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt6ft\" (UniqueName: \"kubernetes.io/projected/f5707350-780e-4092-9454-26ffa111288b-kube-api-access-gt6ft\") on node \"crc\" DevicePath \"\"" Dec 11 06:32:18 crc kubenswrapper[4628]: I1211 06:32:18.672698 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5707350-780e-4092-9454-26ffa111288b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 06:32:18 crc kubenswrapper[4628]: I1211 06:32:18.672768 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5707350-780e-4092-9454-26ffa111288b-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 06:32:18 crc kubenswrapper[4628]: I1211 06:32:18.925058 4628 generic.go:334] "Generic (PLEG): container finished" podID="f5707350-780e-4092-9454-26ffa111288b" containerID="05dfa7fcd51dfa354bf915fe04354419836ab190e66e3d35344373b61863f16b" exitCode=0 Dec 11 06:32:18 crc kubenswrapper[4628]: I1211 06:32:18.925100 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvdq7" event={"ID":"f5707350-780e-4092-9454-26ffa111288b","Type":"ContainerDied","Data":"05dfa7fcd51dfa354bf915fe04354419836ab190e66e3d35344373b61863f16b"} Dec 11 06:32:18 crc kubenswrapper[4628]: I1211 06:32:18.925125 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvdq7" event={"ID":"f5707350-780e-4092-9454-26ffa111288b","Type":"ContainerDied","Data":"0449e8f1b0daa58dc305961caf0458a58755cc2e08e4b49237be6b63290f756d"} Dec 11 06:32:18 crc kubenswrapper[4628]: I1211 06:32:18.925124 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lvdq7" Dec 11 06:32:18 crc kubenswrapper[4628]: I1211 06:32:18.925169 4628 scope.go:117] "RemoveContainer" containerID="05dfa7fcd51dfa354bf915fe04354419836ab190e66e3d35344373b61863f16b" Dec 11 06:32:18 crc kubenswrapper[4628]: I1211 06:32:18.944247 4628 scope.go:117] "RemoveContainer" containerID="7cace17fcadc7c5087a5bf4ef13f7946ba13212fff32b6588002ea80fc375c15" Dec 11 06:32:18 crc kubenswrapper[4628]: I1211 06:32:18.970378 4628 scope.go:117] "RemoveContainer" containerID="5d42b475a6b634b0d445a97c3db7dd3b875c6d7f6cfe12cd49f5c1dc5ea3b959" Dec 11 06:32:18 crc kubenswrapper[4628]: I1211 06:32:18.991906 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lvdq7"] Dec 11 06:32:19 crc kubenswrapper[4628]: I1211 06:32:19.020599 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lvdq7"] Dec 11 06:32:19 crc kubenswrapper[4628]: I1211 06:32:19.024193 4628 scope.go:117] "RemoveContainer" containerID="05dfa7fcd51dfa354bf915fe04354419836ab190e66e3d35344373b61863f16b" Dec 11 06:32:19 crc kubenswrapper[4628]: E1211 06:32:19.029263 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05dfa7fcd51dfa354bf915fe04354419836ab190e66e3d35344373b61863f16b\": container with ID starting with 05dfa7fcd51dfa354bf915fe04354419836ab190e66e3d35344373b61863f16b not found: ID does not exist" containerID="05dfa7fcd51dfa354bf915fe04354419836ab190e66e3d35344373b61863f16b" Dec 11 06:32:19 crc kubenswrapper[4628]: I1211 06:32:19.029338 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05dfa7fcd51dfa354bf915fe04354419836ab190e66e3d35344373b61863f16b"} err="failed to get container status \"05dfa7fcd51dfa354bf915fe04354419836ab190e66e3d35344373b61863f16b\": rpc error: code = NotFound desc = could not find container \"05dfa7fcd51dfa354bf915fe04354419836ab190e66e3d35344373b61863f16b\": container with ID starting with 05dfa7fcd51dfa354bf915fe04354419836ab190e66e3d35344373b61863f16b not found: ID does not exist" Dec 11 06:32:19 crc kubenswrapper[4628]: I1211 06:32:19.029367 4628 scope.go:117] "RemoveContainer" containerID="7cace17fcadc7c5087a5bf4ef13f7946ba13212fff32b6588002ea80fc375c15" Dec 11 06:32:19 crc kubenswrapper[4628]: E1211 06:32:19.033412 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cace17fcadc7c5087a5bf4ef13f7946ba13212fff32b6588002ea80fc375c15\": container with ID starting with 7cace17fcadc7c5087a5bf4ef13f7946ba13212fff32b6588002ea80fc375c15 not found: ID does not exist" containerID="7cace17fcadc7c5087a5bf4ef13f7946ba13212fff32b6588002ea80fc375c15" Dec 11 06:32:19 crc kubenswrapper[4628]: I1211 06:32:19.033640 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cace17fcadc7c5087a5bf4ef13f7946ba13212fff32b6588002ea80fc375c15"} err="failed to get container status \"7cace17fcadc7c5087a5bf4ef13f7946ba13212fff32b6588002ea80fc375c15\": rpc error: code = NotFound desc = could not find container \"7cace17fcadc7c5087a5bf4ef13f7946ba13212fff32b6588002ea80fc375c15\": container with ID starting with 7cace17fcadc7c5087a5bf4ef13f7946ba13212fff32b6588002ea80fc375c15 not found: ID does not exist" Dec 11 06:32:19 crc kubenswrapper[4628]: I1211 06:32:19.033727 4628 scope.go:117] "RemoveContainer" 
containerID="5d42b475a6b634b0d445a97c3db7dd3b875c6d7f6cfe12cd49f5c1dc5ea3b959" Dec 11 06:32:19 crc kubenswrapper[4628]: E1211 06:32:19.034045 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d42b475a6b634b0d445a97c3db7dd3b875c6d7f6cfe12cd49f5c1dc5ea3b959\": container with ID starting with 5d42b475a6b634b0d445a97c3db7dd3b875c6d7f6cfe12cd49f5c1dc5ea3b959 not found: ID does not exist" containerID="5d42b475a6b634b0d445a97c3db7dd3b875c6d7f6cfe12cd49f5c1dc5ea3b959" Dec 11 06:32:19 crc kubenswrapper[4628]: I1211 06:32:19.034113 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d42b475a6b634b0d445a97c3db7dd3b875c6d7f6cfe12cd49f5c1dc5ea3b959"} err="failed to get container status \"5d42b475a6b634b0d445a97c3db7dd3b875c6d7f6cfe12cd49f5c1dc5ea3b959\": rpc error: code = NotFound desc = could not find container \"5d42b475a6b634b0d445a97c3db7dd3b875c6d7f6cfe12cd49f5c1dc5ea3b959\": container with ID starting with 5d42b475a6b634b0d445a97c3db7dd3b875c6d7f6cfe12cd49f5c1dc5ea3b959 not found: ID does not exist" Dec 11 06:32:19 crc kubenswrapper[4628]: I1211 06:32:19.913728 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5707350-780e-4092-9454-26ffa111288b" path="/var/lib/kubelet/pods/f5707350-780e-4092-9454-26ffa111288b/volumes" Dec 11 06:32:29 crc kubenswrapper[4628]: I1211 06:32:29.444516 4628 scope.go:117] "RemoveContainer" containerID="9fa8d7c9af79bfc3fe3320865334f1b60091dbb7252158eeaa3d570e62b63b2b" Dec 11 06:32:31 crc kubenswrapper[4628]: I1211 06:32:31.064209 4628 generic.go:334] "Generic (PLEG): container finished" podID="166f5e85-d96f-490c-9606-f7df3d47809c" containerID="7ffa5292350464bd1bc6babce2b2cd0a4db290d7dd03af9290f2401bb17484cb" exitCode=0 Dec 11 06:32:31 crc kubenswrapper[4628]: I1211 06:32:31.064329 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x7dhv/must-gather-k96rn" event={"ID":"166f5e85-d96f-490c-9606-f7df3d47809c","Type":"ContainerDied","Data":"7ffa5292350464bd1bc6babce2b2cd0a4db290d7dd03af9290f2401bb17484cb"} Dec 11 06:32:31 crc kubenswrapper[4628]: I1211 06:32:31.066344 4628 scope.go:117] "RemoveContainer" containerID="7ffa5292350464bd1bc6babce2b2cd0a4db290d7dd03af9290f2401bb17484cb" Dec 11 06:32:31 crc kubenswrapper[4628]: I1211 06:32:31.723344 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x7dhv_must-gather-k96rn_166f5e85-d96f-490c-9606-f7df3d47809c/gather/0.log" Dec 11 06:32:39 crc kubenswrapper[4628]: I1211 06:32:39.843822 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-x7dhv/must-gather-k96rn"] Dec 11 06:32:39 crc kubenswrapper[4628]: I1211 06:32:39.844749 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-x7dhv/must-gather-k96rn" podUID="166f5e85-d96f-490c-9606-f7df3d47809c" containerName="copy" containerID="cri-o://3d576c4f365aa6fff953fed907aa6081e0435bae1d727df7161b962880881e3e" gracePeriod=2 Dec 11 06:32:39 crc kubenswrapper[4628]: I1211 06:32:39.852649 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x7dhv/must-gather-k96rn"] Dec 11 06:32:40 crc kubenswrapper[4628]: I1211 06:32:40.147399 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x7dhv_must-gather-k96rn_166f5e85-d96f-490c-9606-f7df3d47809c/copy/0.log" Dec 11 06:32:40 crc kubenswrapper[4628]: I1211 06:32:40.148503 4628 generic.go:334] 
"Generic (PLEG): container finished" podID="166f5e85-d96f-490c-9606-f7df3d47809c" containerID="3d576c4f365aa6fff953fed907aa6081e0435bae1d727df7161b962880881e3e" exitCode=143 Dec 11 06:32:40 crc kubenswrapper[4628]: I1211 06:32:40.263311 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x7dhv_must-gather-k96rn_166f5e85-d96f-490c-9606-f7df3d47809c/copy/0.log" Dec 11 06:32:40 crc kubenswrapper[4628]: I1211 06:32:40.263611 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x7dhv/must-gather-k96rn" Dec 11 06:32:40 crc kubenswrapper[4628]: I1211 06:32:40.412342 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tbsh\" (UniqueName: \"kubernetes.io/projected/166f5e85-d96f-490c-9606-f7df3d47809c-kube-api-access-9tbsh\") pod \"166f5e85-d96f-490c-9606-f7df3d47809c\" (UID: \"166f5e85-d96f-490c-9606-f7df3d47809c\") " Dec 11 06:32:40 crc kubenswrapper[4628]: I1211 06:32:40.413314 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/166f5e85-d96f-490c-9606-f7df3d47809c-must-gather-output\") pod \"166f5e85-d96f-490c-9606-f7df3d47809c\" (UID: \"166f5e85-d96f-490c-9606-f7df3d47809c\") " Dec 11 06:32:40 crc kubenswrapper[4628]: I1211 06:32:40.417295 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/166f5e85-d96f-490c-9606-f7df3d47809c-kube-api-access-9tbsh" (OuterVolumeSpecName: "kube-api-access-9tbsh") pod "166f5e85-d96f-490c-9606-f7df3d47809c" (UID: "166f5e85-d96f-490c-9606-f7df3d47809c"). InnerVolumeSpecName "kube-api-access-9tbsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:32:40 crc kubenswrapper[4628]: I1211 06:32:40.516108 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tbsh\" (UniqueName: \"kubernetes.io/projected/166f5e85-d96f-490c-9606-f7df3d47809c-kube-api-access-9tbsh\") on node \"crc\" DevicePath \"\"" Dec 11 06:32:40 crc kubenswrapper[4628]: I1211 06:32:40.641199 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/166f5e85-d96f-490c-9606-f7df3d47809c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "166f5e85-d96f-490c-9606-f7df3d47809c" (UID: "166f5e85-d96f-490c-9606-f7df3d47809c"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:32:40 crc kubenswrapper[4628]: I1211 06:32:40.719391 4628 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/166f5e85-d96f-490c-9606-f7df3d47809c-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 11 06:32:41 crc kubenswrapper[4628]: I1211 06:32:41.157568 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x7dhv_must-gather-k96rn_166f5e85-d96f-490c-9606-f7df3d47809c/copy/0.log" Dec 11 06:32:41 crc kubenswrapper[4628]: I1211 06:32:41.158237 4628 scope.go:117] "RemoveContainer" containerID="3d576c4f365aa6fff953fed907aa6081e0435bae1d727df7161b962880881e3e" Dec 11 06:32:41 crc kubenswrapper[4628]: I1211 06:32:41.158296 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x7dhv/must-gather-k96rn" Dec 11 06:32:41 crc kubenswrapper[4628]: I1211 06:32:41.191604 4628 scope.go:117] "RemoveContainer" containerID="7ffa5292350464bd1bc6babce2b2cd0a4db290d7dd03af9290f2401bb17484cb" Dec 11 06:32:41 crc kubenswrapper[4628]: I1211 06:32:41.902964 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="166f5e85-d96f-490c-9606-f7df3d47809c" path="/var/lib/kubelet/pods/166f5e85-d96f-490c-9606-f7df3d47809c/volumes" Dec 11 06:34:01 crc kubenswrapper[4628]: I1211 06:34:01.426577 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:34:01 crc kubenswrapper[4628]: I1211 06:34:01.427495 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:34:31 crc kubenswrapper[4628]: I1211 06:34:31.427124 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:34:31 crc kubenswrapper[4628]: I1211 06:34:31.427744 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:35:01 crc kubenswrapper[4628]: I1211 06:35:01.426510 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:35:01 crc kubenswrapper[4628]: I1211 06:35:01.427952 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:35:01 crc kubenswrapper[4628]: I1211 06:35:01.428060 4628 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 06:35:01 crc kubenswrapper[4628]: I1211 06:35:01.428793 4628 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88388dbd3453cd64d96e82a99d6a17e7eeefa79420b0d46578dd03a105614740"} pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 06:35:01 crc kubenswrapper[4628]: I1211 06:35:01.429044 4628 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" containerID="cri-o://88388dbd3453cd64d96e82a99d6a17e7eeefa79420b0d46578dd03a105614740" gracePeriod=600 Dec 11 06:35:01 crc kubenswrapper[4628]: I1211 06:35:01.947496 4628 generic.go:334] "Generic (PLEG): container finished" podID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerID="88388dbd3453cd64d96e82a99d6a17e7eeefa79420b0d46578dd03a105614740" exitCode=0 Dec 11 06:35:01 crc kubenswrapper[4628]: I1211 06:35:01.947545 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerDied","Data":"88388dbd3453cd64d96e82a99d6a17e7eeefa79420b0d46578dd03a105614740"} Dec 11 06:35:01 crc kubenswrapper[4628]: I1211 06:35:01.947839 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16"} Dec 11 06:35:01 crc kubenswrapper[4628]: I1211 06:35:01.947877 4628 scope.go:117] "RemoveContainer" containerID="cbb7b1afb5c77aeb6e87801d57ce1a9dd38bfd8d5e9c920ab34076b7d7ac55aa" Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.691514 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lwvpk/must-gather-jfb8l"] Dec 11 06:35:54 crc kubenswrapper[4628]: E1211 06:35:54.692585 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5707350-780e-4092-9454-26ffa111288b" containerName="extract-content" Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.692604 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5707350-780e-4092-9454-26ffa111288b" containerName="extract-content" Dec 11 06:35:54 crc kubenswrapper[4628]: E1211 06:35:54.692637 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166f5e85-d96f-490c-9606-f7df3d47809c" containerName="gather" Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.692645 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="166f5e85-d96f-490c-9606-f7df3d47809c" containerName="gather" Dec 11 06:35:54 crc kubenswrapper[4628]: E1211 06:35:54.692665 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166f5e85-d96f-490c-9606-f7df3d47809c" containerName="copy" Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.692673 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="166f5e85-d96f-490c-9606-f7df3d47809c" containerName="copy" Dec 11 06:35:54 crc kubenswrapper[4628]: E1211 06:35:54.692690 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5707350-780e-4092-9454-26ffa111288b" containerName="registry-server" Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.692698 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5707350-780e-4092-9454-26ffa111288b" containerName="registry-server" Dec 11 06:35:54 crc kubenswrapper[4628]: E1211 06:35:54.692717 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5707350-780e-4092-9454-26ffa111288b" containerName="extract-utilities" Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.692725 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5707350-780e-4092-9454-26ffa111288b" containerName="extract-utilities" Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.693001 4628 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f5707350-780e-4092-9454-26ffa111288b" containerName="registry-server" Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.693019 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="166f5e85-d96f-490c-9606-f7df3d47809c" containerName="copy" Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.693040 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="166f5e85-d96f-490c-9606-f7df3d47809c" containerName="gather" Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.694257 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lwvpk/must-gather-jfb8l" Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.702506 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lwvpk"/"openshift-service-ca.crt" Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.704742 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lwvpk"/"kube-root-ca.crt" Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.707561 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lwvpk/must-gather-jfb8l"] Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.752684 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tbbj\" (UniqueName: \"kubernetes.io/projected/7e0799c7-7ac6-46c5-8478-4edc7de737d2-kube-api-access-8tbbj\") pod \"must-gather-jfb8l\" (UID: \"7e0799c7-7ac6-46c5-8478-4edc7de737d2\") " pod="openshift-must-gather-lwvpk/must-gather-jfb8l" Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.752893 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7e0799c7-7ac6-46c5-8478-4edc7de737d2-must-gather-output\") pod \"must-gather-jfb8l\" (UID: \"7e0799c7-7ac6-46c5-8478-4edc7de737d2\") " pod="openshift-must-gather-lwvpk/must-gather-jfb8l" Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.854158 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7e0799c7-7ac6-46c5-8478-4edc7de737d2-must-gather-output\") pod \"must-gather-jfb8l\" (UID: \"7e0799c7-7ac6-46c5-8478-4edc7de737d2\") " pod="openshift-must-gather-lwvpk/must-gather-jfb8l" Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.854599 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7e0799c7-7ac6-46c5-8478-4edc7de737d2-must-gather-output\") pod \"must-gather-jfb8l\" (UID: \"7e0799c7-7ac6-46c5-8478-4edc7de737d2\") " pod="openshift-must-gather-lwvpk/must-gather-jfb8l" Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.872191 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tbbj\" (UniqueName: \"kubernetes.io/projected/7e0799c7-7ac6-46c5-8478-4edc7de737d2-kube-api-access-8tbbj\") pod \"must-gather-jfb8l\" (UID: \"7e0799c7-7ac6-46c5-8478-4edc7de737d2\") " pod="openshift-must-gather-lwvpk/must-gather-jfb8l" Dec 11 06:35:54 crc kubenswrapper[4628]: I1211 06:35:54.879374 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tbbj\" (UniqueName: \"kubernetes.io/projected/7e0799c7-7ac6-46c5-8478-4edc7de737d2-kube-api-access-8tbbj\") pod \"must-gather-jfb8l\" 
(UID: \"7e0799c7-7ac6-46c5-8478-4edc7de737d2\") " pod="openshift-must-gather-lwvpk/must-gather-jfb8l" Dec 11 06:35:55 crc kubenswrapper[4628]: I1211 06:35:55.018106 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lwvpk/must-gather-jfb8l" Dec 11 06:35:55 crc kubenswrapper[4628]: I1211 06:35:55.538170 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lwvpk/must-gather-jfb8l"] Dec 11 06:35:56 crc kubenswrapper[4628]: I1211 06:35:56.493315 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwvpk/must-gather-jfb8l" event={"ID":"7e0799c7-7ac6-46c5-8478-4edc7de737d2","Type":"ContainerStarted","Data":"2fcb3b907ca57da1b4593792d38b6eeb6d2b08e31804336667a54ef870b9f8ef"} Dec 11 06:35:56 crc kubenswrapper[4628]: I1211 06:35:56.494719 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwvpk/must-gather-jfb8l" event={"ID":"7e0799c7-7ac6-46c5-8478-4edc7de737d2","Type":"ContainerStarted","Data":"501554b130ab89d0b7ee4cc093ca6400532a7b899a4fc71fb3f1c92cf39c0e0b"} Dec 11 06:35:56 crc kubenswrapper[4628]: I1211 06:35:56.494823 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwvpk/must-gather-jfb8l" event={"ID":"7e0799c7-7ac6-46c5-8478-4edc7de737d2","Type":"ContainerStarted","Data":"2dde863e0291c83422aac11d1c2a1448d7a0f137b7faf722e15fced3847b1fa9"} Dec 11 06:35:58 crc kubenswrapper[4628]: I1211 06:35:58.247575 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lwvpk/must-gather-jfb8l" podStartSLOduration=4.247552823 podStartE2EDuration="4.247552823s" podCreationTimestamp="2025-12-11 06:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 06:35:56.515100197 +0000 UTC m=+4858.932446895" watchObservedRunningTime="2025-12-11 06:35:58.247552823 +0000 UTC m=+4860.664899521" Dec 11 06:35:58 crc kubenswrapper[4628]: I1211 06:35:58.252747 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ll2gt"] Dec 11 06:35:58 crc kubenswrapper[4628]: I1211 06:35:58.254703 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ll2gt" Dec 11 06:35:58 crc kubenswrapper[4628]: I1211 06:35:58.269331 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ll2gt"] Dec 11 06:35:58 crc kubenswrapper[4628]: I1211 06:35:58.339477 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vpls\" (UniqueName: \"kubernetes.io/projected/36cda578-d08d-4ad1-bdb6-d4aed234c6e9-kube-api-access-9vpls\") pod \"redhat-operators-ll2gt\" (UID: \"36cda578-d08d-4ad1-bdb6-d4aed234c6e9\") " pod="openshift-marketplace/redhat-operators-ll2gt" Dec 11 06:35:58 crc kubenswrapper[4628]: I1211 06:35:58.339544 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36cda578-d08d-4ad1-bdb6-d4aed234c6e9-catalog-content\") pod \"redhat-operators-ll2gt\" (UID: \"36cda578-d08d-4ad1-bdb6-d4aed234c6e9\") " pod="openshift-marketplace/redhat-operators-ll2gt" Dec 11 06:35:58 crc kubenswrapper[4628]: I1211 06:35:58.339565 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36cda578-d08d-4ad1-bdb6-d4aed234c6e9-utilities\") pod \"redhat-operators-ll2gt\" (UID: \"36cda578-d08d-4ad1-bdb6-d4aed234c6e9\") " pod="openshift-marketplace/redhat-operators-ll2gt" Dec 11 06:35:58 crc kubenswrapper[4628]: I1211 06:35:58.441299 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36cda578-d08d-4ad1-bdb6-d4aed234c6e9-catalog-content\") pod \"redhat-operators-ll2gt\" (UID: \"36cda578-d08d-4ad1-bdb6-d4aed234c6e9\") " pod="openshift-marketplace/redhat-operators-ll2gt" Dec 11 06:35:58 crc kubenswrapper[4628]: I1211 06:35:58.441376 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36cda578-d08d-4ad1-bdb6-d4aed234c6e9-utilities\") pod \"redhat-operators-ll2gt\" (UID: \"36cda578-d08d-4ad1-bdb6-d4aed234c6e9\") " pod="openshift-marketplace/redhat-operators-ll2gt" Dec 11 06:35:58 crc kubenswrapper[4628]: I1211 06:35:58.441619 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vpls\" (UniqueName: \"kubernetes.io/projected/36cda578-d08d-4ad1-bdb6-d4aed234c6e9-kube-api-access-9vpls\") pod \"redhat-operators-ll2gt\" (UID: \"36cda578-d08d-4ad1-bdb6-d4aed234c6e9\") " pod="openshift-marketplace/redhat-operators-ll2gt" Dec 11 06:35:58 crc kubenswrapper[4628]: I1211 06:35:58.442595 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36cda578-d08d-4ad1-bdb6-d4aed234c6e9-catalog-content\") pod \"redhat-operators-ll2gt\" (UID: \"36cda578-d08d-4ad1-bdb6-d4aed234c6e9\") " pod="openshift-marketplace/redhat-operators-ll2gt" Dec 11 06:35:58 crc kubenswrapper[4628]: I1211 06:35:58.442766 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36cda578-d08d-4ad1-bdb6-d4aed234c6e9-utilities\") pod \"redhat-operators-ll2gt\" (UID: \"36cda578-d08d-4ad1-bdb6-d4aed234c6e9\") " pod="openshift-marketplace/redhat-operators-ll2gt" Dec 11 06:35:58 crc kubenswrapper[4628]: I1211 06:35:58.472824 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9vpls\" (UniqueName: \"kubernetes.io/projected/36cda578-d08d-4ad1-bdb6-d4aed234c6e9-kube-api-access-9vpls\") pod \"redhat-operators-ll2gt\" (UID: \"36cda578-d08d-4ad1-bdb6-d4aed234c6e9\") " pod="openshift-marketplace/redhat-operators-ll2gt" Dec 11 06:35:58 crc kubenswrapper[4628]: I1211 06:35:58.576626 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ll2gt" Dec 11 06:35:59 crc kubenswrapper[4628]: I1211 06:35:59.169720 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ll2gt"] Dec 11 06:35:59 crc kubenswrapper[4628]: I1211 06:35:59.520022 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll2gt" event={"ID":"36cda578-d08d-4ad1-bdb6-d4aed234c6e9","Type":"ContainerStarted","Data":"fe9dc6d64b5695c684bc49f9d0347e1866959d1cfba1a9544877ac4bf8f1ce5f"} Dec 11 06:36:00 crc kubenswrapper[4628]: I1211 06:36:00.530866 4628 generic.go:334] "Generic (PLEG): container finished" podID="36cda578-d08d-4ad1-bdb6-d4aed234c6e9" containerID="06c28bb59ca7aa870393a40b1468edb5525cdf8966b6a1a90903cd300d9ed434" exitCode=0 Dec 11 06:36:00 crc kubenswrapper[4628]: I1211 06:36:00.530990 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll2gt" event={"ID":"36cda578-d08d-4ad1-bdb6-d4aed234c6e9","Type":"ContainerDied","Data":"06c28bb59ca7aa870393a40b1468edb5525cdf8966b6a1a90903cd300d9ed434"} Dec 11 06:36:00 crc kubenswrapper[4628]: I1211 06:36:00.751946 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lwvpk/crc-debug-ww4tm"] Dec 11 06:36:00 crc kubenswrapper[4628]: I1211 06:36:00.753099 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lwvpk/crc-debug-ww4tm" Dec 11 06:36:00 crc kubenswrapper[4628]: I1211 06:36:00.755265 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-lwvpk"/"default-dockercfg-7hxds" Dec 11 06:36:00 crc kubenswrapper[4628]: I1211 06:36:00.786873 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcbf9\" (UniqueName: \"kubernetes.io/projected/1631206f-7c4d-4d8c-83e4-0f79a8dc403a-kube-api-access-xcbf9\") pod \"crc-debug-ww4tm\" (UID: \"1631206f-7c4d-4d8c-83e4-0f79a8dc403a\") " pod="openshift-must-gather-lwvpk/crc-debug-ww4tm" Dec 11 06:36:00 crc kubenswrapper[4628]: I1211 06:36:00.787011 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1631206f-7c4d-4d8c-83e4-0f79a8dc403a-host\") pod \"crc-debug-ww4tm\" (UID: \"1631206f-7c4d-4d8c-83e4-0f79a8dc403a\") " pod="openshift-must-gather-lwvpk/crc-debug-ww4tm" Dec 11 06:36:00 crc kubenswrapper[4628]: I1211 06:36:00.888402 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcbf9\" (UniqueName: \"kubernetes.io/projected/1631206f-7c4d-4d8c-83e4-0f79a8dc403a-kube-api-access-xcbf9\") pod \"crc-debug-ww4tm\" (UID: \"1631206f-7c4d-4d8c-83e4-0f79a8dc403a\") " pod="openshift-must-gather-lwvpk/crc-debug-ww4tm" Dec 11 06:36:00 crc kubenswrapper[4628]: I1211 06:36:00.888608 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1631206f-7c4d-4d8c-83e4-0f79a8dc403a-host\") pod \"crc-debug-ww4tm\" (UID: \"1631206f-7c4d-4d8c-83e4-0f79a8dc403a\") " pod="openshift-must-gather-lwvpk/crc-debug-ww4tm" Dec 11 06:36:00 crc kubenswrapper[4628]: I1211 06:36:00.888802 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1631206f-7c4d-4d8c-83e4-0f79a8dc403a-host\") pod \"crc-debug-ww4tm\" (UID: \"1631206f-7c4d-4d8c-83e4-0f79a8dc403a\") " pod="openshift-must-gather-lwvpk/crc-debug-ww4tm" Dec 11 06:36:00 crc kubenswrapper[4628]: I1211 06:36:00.914417 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcbf9\" (UniqueName: \"kubernetes.io/projected/1631206f-7c4d-4d8c-83e4-0f79a8dc403a-kube-api-access-xcbf9\") pod \"crc-debug-ww4tm\" (UID: \"1631206f-7c4d-4d8c-83e4-0f79a8dc403a\") " pod="openshift-must-gather-lwvpk/crc-debug-ww4tm" Dec 11 06:36:01 crc kubenswrapper[4628]: I1211 06:36:01.069951 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lwvpk/crc-debug-ww4tm" Dec 11 06:36:01 crc kubenswrapper[4628]: I1211 06:36:01.543780 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwvpk/crc-debug-ww4tm" event={"ID":"1631206f-7c4d-4d8c-83e4-0f79a8dc403a","Type":"ContainerStarted","Data":"a2b1e53ce5b142164803e1c7cca6fa0725ead78f89906a3e7abe963da38234ff"} Dec 11 06:36:01 crc kubenswrapper[4628]: I1211 06:36:01.544374 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwvpk/crc-debug-ww4tm" event={"ID":"1631206f-7c4d-4d8c-83e4-0f79a8dc403a","Type":"ContainerStarted","Data":"ddf680499cfdb3c54a35273ccf6e23b4716ef03fce3010362d58d55c3e3202d1"} Dec 11 06:36:01 crc kubenswrapper[4628]: I1211 06:36:01.571301 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lwvpk/crc-debug-ww4tm" podStartSLOduration=1.5712746260000001 podStartE2EDuration="1.571274626s" podCreationTimestamp="2025-12-11 06:36:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 06:36:01.565269764 +0000 UTC m=+4863.982616472" watchObservedRunningTime="2025-12-11 06:36:01.571274626 +0000 UTC m=+4863.988621364" Dec 11 06:36:02 crc kubenswrapper[4628]: I1211 06:36:02.558600 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll2gt" event={"ID":"36cda578-d08d-4ad1-bdb6-d4aed234c6e9","Type":"ContainerStarted","Data":"7aeefa2eca896185e8b04e30afdd64205ae1872990a412c45ef64816ea37337e"} Dec 11 06:36:05 crc kubenswrapper[4628]: I1211 06:36:05.591212 4628 generic.go:334] "Generic (PLEG): container finished" podID="36cda578-d08d-4ad1-bdb6-d4aed234c6e9" containerID="7aeefa2eca896185e8b04e30afdd64205ae1872990a412c45ef64816ea37337e" exitCode=0 Dec 11 06:36:05 crc kubenswrapper[4628]: I1211 06:36:05.591282 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll2gt" event={"ID":"36cda578-d08d-4ad1-bdb6-d4aed234c6e9","Type":"ContainerDied","Data":"7aeefa2eca896185e8b04e30afdd64205ae1872990a412c45ef64816ea37337e"} Dec 11 06:36:07 crc kubenswrapper[4628]: I1211 06:36:07.619312 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll2gt" event={"ID":"36cda578-d08d-4ad1-bdb6-d4aed234c6e9","Type":"ContainerStarted","Data":"2d410fec42708c4f3273c78148ec9714992ba399237eb7d36eaba535f31e62bf"} Dec 11 06:36:07 crc kubenswrapper[4628]: I1211 06:36:07.649284 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ll2gt" podStartSLOduration=3.724148451 podStartE2EDuration="9.649266365s" podCreationTimestamp="2025-12-11 06:35:58 +0000 UTC" firstStartedPulling="2025-12-11 06:36:00.5329141 +0000 UTC m=+4862.950260798" lastFinishedPulling="2025-12-11 06:36:06.458032014 +0000 UTC m=+4868.875378712" observedRunningTime="2025-12-11 06:36:07.641168046 +0000 UTC m=+4870.058514744" watchObservedRunningTime="2025-12-11 06:36:07.649266365 +0000 UTC m=+4870.066613063" Dec 11 06:36:08 crc kubenswrapper[4628]: I1211 06:36:08.577789 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ll2gt" Dec 11 06:36:08 crc kubenswrapper[4628]: I1211 06:36:08.578450 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ll2gt" Dec 11 06:36:09 crc 
kubenswrapper[4628]: I1211 06:36:09.636233 4628 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ll2gt" podUID="36cda578-d08d-4ad1-bdb6-d4aed234c6e9" containerName="registry-server" probeResult="failure" output=< Dec 11 06:36:09 crc kubenswrapper[4628]: timeout: failed to connect service ":50051" within 1s Dec 11 06:36:09 crc kubenswrapper[4628]: > Dec 11 06:36:18 crc kubenswrapper[4628]: I1211 06:36:18.846142 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ll2gt" Dec 11 06:36:18 crc kubenswrapper[4628]: I1211 06:36:18.896411 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ll2gt" Dec 11 06:36:19 crc kubenswrapper[4628]: I1211 06:36:19.089495 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ll2gt"] Dec 11 06:36:20 crc kubenswrapper[4628]: I1211 06:36:20.747132 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ll2gt" podUID="36cda578-d08d-4ad1-bdb6-d4aed234c6e9" containerName="registry-server" containerID="cri-o://2d410fec42708c4f3273c78148ec9714992ba399237eb7d36eaba535f31e62bf" gracePeriod=2 Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.270223 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ll2gt" Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.379714 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36cda578-d08d-4ad1-bdb6-d4aed234c6e9-catalog-content\") pod \"36cda578-d08d-4ad1-bdb6-d4aed234c6e9\" (UID: \"36cda578-d08d-4ad1-bdb6-d4aed234c6e9\") " Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.379770 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36cda578-d08d-4ad1-bdb6-d4aed234c6e9-utilities\") pod \"36cda578-d08d-4ad1-bdb6-d4aed234c6e9\" (UID: \"36cda578-d08d-4ad1-bdb6-d4aed234c6e9\") " Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.379888 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vpls\" (UniqueName: \"kubernetes.io/projected/36cda578-d08d-4ad1-bdb6-d4aed234c6e9-kube-api-access-9vpls\") pod \"36cda578-d08d-4ad1-bdb6-d4aed234c6e9\" (UID: \"36cda578-d08d-4ad1-bdb6-d4aed234c6e9\") " Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.381123 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36cda578-d08d-4ad1-bdb6-d4aed234c6e9-utilities" (OuterVolumeSpecName: "utilities") pod "36cda578-d08d-4ad1-bdb6-d4aed234c6e9" (UID: "36cda578-d08d-4ad1-bdb6-d4aed234c6e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.385794 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36cda578-d08d-4ad1-bdb6-d4aed234c6e9-kube-api-access-9vpls" (OuterVolumeSpecName: "kube-api-access-9vpls") pod "36cda578-d08d-4ad1-bdb6-d4aed234c6e9" (UID: "36cda578-d08d-4ad1-bdb6-d4aed234c6e9"). InnerVolumeSpecName "kube-api-access-9vpls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.482441 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vpls\" (UniqueName: \"kubernetes.io/projected/36cda578-d08d-4ad1-bdb6-d4aed234c6e9-kube-api-access-9vpls\") on node \"crc\" DevicePath \"\"" Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.482476 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36cda578-d08d-4ad1-bdb6-d4aed234c6e9-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.511308 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36cda578-d08d-4ad1-bdb6-d4aed234c6e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36cda578-d08d-4ad1-bdb6-d4aed234c6e9" (UID: "36cda578-d08d-4ad1-bdb6-d4aed234c6e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.584591 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36cda578-d08d-4ad1-bdb6-d4aed234c6e9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.761185 4628 generic.go:334] "Generic (PLEG): container finished" podID="36cda578-d08d-4ad1-bdb6-d4aed234c6e9" containerID="2d410fec42708c4f3273c78148ec9714992ba399237eb7d36eaba535f31e62bf" exitCode=0 Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.761229 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll2gt" event={"ID":"36cda578-d08d-4ad1-bdb6-d4aed234c6e9","Type":"ContainerDied","Data":"2d410fec42708c4f3273c78148ec9714992ba399237eb7d36eaba535f31e62bf"} Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.761260 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ll2gt" event={"ID":"36cda578-d08d-4ad1-bdb6-d4aed234c6e9","Type":"ContainerDied","Data":"fe9dc6d64b5695c684bc49f9d0347e1866959d1cfba1a9544877ac4bf8f1ce5f"} Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.761280 4628 scope.go:117] "RemoveContainer" containerID="2d410fec42708c4f3273c78148ec9714992ba399237eb7d36eaba535f31e62bf" Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.761429 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ll2gt" Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.807749 4628 scope.go:117] "RemoveContainer" containerID="7aeefa2eca896185e8b04e30afdd64205ae1872990a412c45ef64816ea37337e" Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.814809 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ll2gt"] Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.826085 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ll2gt"] Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.831801 4628 scope.go:117] "RemoveContainer" containerID="06c28bb59ca7aa870393a40b1468edb5525cdf8966b6a1a90903cd300d9ed434" Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.886661 4628 scope.go:117] "RemoveContainer" containerID="2d410fec42708c4f3273c78148ec9714992ba399237eb7d36eaba535f31e62bf" Dec 11 06:36:21 crc kubenswrapper[4628]: E1211 06:36:21.887065 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d410fec42708c4f3273c78148ec9714992ba399237eb7d36eaba535f31e62bf\": container with ID starting with 2d410fec42708c4f3273c78148ec9714992ba399237eb7d36eaba535f31e62bf not found: ID does not exist" containerID="2d410fec42708c4f3273c78148ec9714992ba399237eb7d36eaba535f31e62bf" Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.887133 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d410fec42708c4f3273c78148ec9714992ba399237eb7d36eaba535f31e62bf"} err="failed to get container status \"2d410fec42708c4f3273c78148ec9714992ba399237eb7d36eaba535f31e62bf\": rpc error: code = NotFound desc = could not find container \"2d410fec42708c4f3273c78148ec9714992ba399237eb7d36eaba535f31e62bf\": container with ID starting with 2d410fec42708c4f3273c78148ec9714992ba399237eb7d36eaba535f31e62bf not found: ID does not exist" Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.887154 4628 scope.go:117] "RemoveContainer" containerID="7aeefa2eca896185e8b04e30afdd64205ae1872990a412c45ef64816ea37337e" Dec 11 06:36:21 crc kubenswrapper[4628]: E1211 06:36:21.887410 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aeefa2eca896185e8b04e30afdd64205ae1872990a412c45ef64816ea37337e\": container with ID starting with 7aeefa2eca896185e8b04e30afdd64205ae1872990a412c45ef64816ea37337e not found: ID does not exist" containerID="7aeefa2eca896185e8b04e30afdd64205ae1872990a412c45ef64816ea37337e" Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.887438 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aeefa2eca896185e8b04e30afdd64205ae1872990a412c45ef64816ea37337e"} err="failed to get container status \"7aeefa2eca896185e8b04e30afdd64205ae1872990a412c45ef64816ea37337e\": rpc error: code = NotFound desc = could not find container \"7aeefa2eca896185e8b04e30afdd64205ae1872990a412c45ef64816ea37337e\": container with ID starting with 7aeefa2eca896185e8b04e30afdd64205ae1872990a412c45ef64816ea37337e not found: ID does not exist" Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.887453 4628 scope.go:117] "RemoveContainer" containerID="06c28bb59ca7aa870393a40b1468edb5525cdf8966b6a1a90903cd300d9ed434" Dec 11 06:36:21 crc kubenswrapper[4628]: E1211 06:36:21.887625 4628 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"06c28bb59ca7aa870393a40b1468edb5525cdf8966b6a1a90903cd300d9ed434\": container with ID starting with 06c28bb59ca7aa870393a40b1468edb5525cdf8966b6a1a90903cd300d9ed434 not found: ID does not exist" containerID="06c28bb59ca7aa870393a40b1468edb5525cdf8966b6a1a90903cd300d9ed434" Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.887650 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06c28bb59ca7aa870393a40b1468edb5525cdf8966b6a1a90903cd300d9ed434"} err="failed to get container status \"06c28bb59ca7aa870393a40b1468edb5525cdf8966b6a1a90903cd300d9ed434\": rpc error: code = NotFound desc = could not find container \"06c28bb59ca7aa870393a40b1468edb5525cdf8966b6a1a90903cd300d9ed434\": container with ID starting with 06c28bb59ca7aa870393a40b1468edb5525cdf8966b6a1a90903cd300d9ed434 not found: ID does not exist" Dec 11 06:36:21 crc kubenswrapper[4628]: I1211 06:36:21.903480 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36cda578-d08d-4ad1-bdb6-d4aed234c6e9" path="/var/lib/kubelet/pods/36cda578-d08d-4ad1-bdb6-d4aed234c6e9/volumes" Dec 11 06:36:35 crc kubenswrapper[4628]: I1211 06:36:35.160700 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r4527"] Dec 11 06:36:35 crc kubenswrapper[4628]: E1211 06:36:35.161600 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36cda578-d08d-4ad1-bdb6-d4aed234c6e9" containerName="extract-utilities" Dec 11 06:36:35 crc kubenswrapper[4628]: I1211 06:36:35.161613 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="36cda578-d08d-4ad1-bdb6-d4aed234c6e9" containerName="extract-utilities" Dec 11 06:36:35 crc kubenswrapper[4628]: E1211 06:36:35.161634 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36cda578-d08d-4ad1-bdb6-d4aed234c6e9" containerName="extract-content" Dec 11 06:36:35 crc kubenswrapper[4628]: I1211 06:36:35.161640 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="36cda578-d08d-4ad1-bdb6-d4aed234c6e9" containerName="extract-content" Dec 11 06:36:35 crc kubenswrapper[4628]: E1211 06:36:35.161646 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36cda578-d08d-4ad1-bdb6-d4aed234c6e9" containerName="registry-server" Dec 11 06:36:35 crc kubenswrapper[4628]: I1211 06:36:35.161652 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="36cda578-d08d-4ad1-bdb6-d4aed234c6e9" containerName="registry-server" Dec 11 06:36:35 crc kubenswrapper[4628]: I1211 06:36:35.161928 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="36cda578-d08d-4ad1-bdb6-d4aed234c6e9" containerName="registry-server" Dec 11 06:36:35 crc kubenswrapper[4628]: I1211 06:36:35.163197 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r4527" Dec 11 06:36:35 crc kubenswrapper[4628]: I1211 06:36:35.180775 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r4527"] Dec 11 06:36:35 crc kubenswrapper[4628]: I1211 06:36:35.243039 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5149c12a-fa98-4588-9fda-4c8471752df1-utilities\") pod \"certified-operators-r4527\" (UID: \"5149c12a-fa98-4588-9fda-4c8471752df1\") " pod="openshift-marketplace/certified-operators-r4527" Dec 11 06:36:35 crc kubenswrapper[4628]: I1211 06:36:35.243162 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5149c12a-fa98-4588-9fda-4c8471752df1-catalog-content\") pod \"certified-operators-r4527\" (UID: \"5149c12a-fa98-4588-9fda-4c8471752df1\") " pod="openshift-marketplace/certified-operators-r4527" Dec 11 06:36:35 crc kubenswrapper[4628]: I1211 06:36:35.243203 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwj2q\" (UniqueName: \"kubernetes.io/projected/5149c12a-fa98-4588-9fda-4c8471752df1-kube-api-access-vwj2q\") pod \"certified-operators-r4527\" (UID: \"5149c12a-fa98-4588-9fda-4c8471752df1\") " pod="openshift-marketplace/certified-operators-r4527" Dec 11 06:36:35 crc kubenswrapper[4628]: I1211 06:36:35.345212 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5149c12a-fa98-4588-9fda-4c8471752df1-catalog-content\") pod \"certified-operators-r4527\" (UID: \"5149c12a-fa98-4588-9fda-4c8471752df1\") " pod="openshift-marketplace/certified-operators-r4527" Dec 11 06:36:35 crc kubenswrapper[4628]: I1211 06:36:35.345282 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwj2q\" (UniqueName: \"kubernetes.io/projected/5149c12a-fa98-4588-9fda-4c8471752df1-kube-api-access-vwj2q\") pod \"certified-operators-r4527\" (UID: \"5149c12a-fa98-4588-9fda-4c8471752df1\") " pod="openshift-marketplace/certified-operators-r4527" Dec 11 06:36:35 crc kubenswrapper[4628]: I1211 06:36:35.345343 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5149c12a-fa98-4588-9fda-4c8471752df1-utilities\") pod \"certified-operators-r4527\" (UID: \"5149c12a-fa98-4588-9fda-4c8471752df1\") " pod="openshift-marketplace/certified-operators-r4527" Dec 11 06:36:35 crc kubenswrapper[4628]: I1211 06:36:35.345788 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5149c12a-fa98-4588-9fda-4c8471752df1-catalog-content\") pod \"certified-operators-r4527\" (UID: \"5149c12a-fa98-4588-9fda-4c8471752df1\") " pod="openshift-marketplace/certified-operators-r4527" Dec 11 06:36:35 crc kubenswrapper[4628]: I1211 06:36:35.347125 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5149c12a-fa98-4588-9fda-4c8471752df1-utilities\") pod \"certified-operators-r4527\" (UID: \"5149c12a-fa98-4588-9fda-4c8471752df1\") " pod="openshift-marketplace/certified-operators-r4527" Dec 11 06:36:35 crc kubenswrapper[4628]: I1211 06:36:35.389806 4628 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vwj2q\" (UniqueName: \"kubernetes.io/projected/5149c12a-fa98-4588-9fda-4c8471752df1-kube-api-access-vwj2q\") pod \"certified-operators-r4527\" (UID: \"5149c12a-fa98-4588-9fda-4c8471752df1\") " pod="openshift-marketplace/certified-operators-r4527" Dec 11 06:36:35 crc kubenswrapper[4628]: I1211 06:36:35.484915 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r4527" Dec 11 06:36:36 crc kubenswrapper[4628]: I1211 06:36:36.091901 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r4527"] Dec 11 06:36:36 crc kubenswrapper[4628]: E1211 06:36:36.567174 4628 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5149c12a_fa98_4588_9fda_4c8471752df1.slice/crio-conmon-af72156e3f2c115f390abcf492eabe7ebe23432abe70e86c4c5bd21686eb4792.scope\": RecentStats: unable to find data in memory cache]" Dec 11 06:36:36 crc kubenswrapper[4628]: I1211 06:36:36.902354 4628 generic.go:334] "Generic (PLEG): container finished" podID="5149c12a-fa98-4588-9fda-4c8471752df1" containerID="af72156e3f2c115f390abcf492eabe7ebe23432abe70e86c4c5bd21686eb4792" exitCode=0 Dec 11 06:36:36 crc kubenswrapper[4628]: I1211 06:36:36.902634 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r4527" event={"ID":"5149c12a-fa98-4588-9fda-4c8471752df1","Type":"ContainerDied","Data":"af72156e3f2c115f390abcf492eabe7ebe23432abe70e86c4c5bd21686eb4792"} Dec 11 06:36:36 crc kubenswrapper[4628]: I1211 06:36:36.902659 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r4527" event={"ID":"5149c12a-fa98-4588-9fda-4c8471752df1","Type":"ContainerStarted","Data":"65de236fae772dc8140504fbc2146f2b9a71fd230935e7ab4a819ddf55e30a4a"} Dec 11 06:36:38 crc kubenswrapper[4628]: I1211 06:36:38.938807 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r4527" event={"ID":"5149c12a-fa98-4588-9fda-4c8471752df1","Type":"ContainerStarted","Data":"5dceba55c5f92179737abfae28454084d2135d85d40cc93212b70812113109be"} Dec 11 06:36:39 crc kubenswrapper[4628]: I1211 06:36:39.953198 4628 generic.go:334] "Generic (PLEG): container finished" podID="5149c12a-fa98-4588-9fda-4c8471752df1" containerID="5dceba55c5f92179737abfae28454084d2135d85d40cc93212b70812113109be" exitCode=0 Dec 11 06:36:39 crc kubenswrapper[4628]: I1211 06:36:39.953309 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r4527" event={"ID":"5149c12a-fa98-4588-9fda-4c8471752df1","Type":"ContainerDied","Data":"5dceba55c5f92179737abfae28454084d2135d85d40cc93212b70812113109be"} Dec 11 06:36:39 crc kubenswrapper[4628]: I1211 06:36:39.956145 4628 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 06:36:40 crc kubenswrapper[4628]: I1211 06:36:40.963771 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r4527" event={"ID":"5149c12a-fa98-4588-9fda-4c8471752df1","Type":"ContainerStarted","Data":"0db397e734392f9ed7063db2b85103d147992623d92f60fd035b2136b8887e6a"} Dec 11 06:36:40 crc kubenswrapper[4628]: I1211 06:36:40.984675 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-r4527" podStartSLOduration=2.419226031 podStartE2EDuration="5.984654999s" podCreationTimestamp="2025-12-11 06:36:35 +0000 UTC" firstStartedPulling="2025-12-11 06:36:36.905005748 +0000 UTC m=+4899.322352446" lastFinishedPulling="2025-12-11 06:36:40.470434716 +0000 UTC m=+4902.887781414" observedRunningTime="2025-12-11 06:36:40.984228188 +0000 UTC m=+4903.401574886" watchObservedRunningTime="2025-12-11 06:36:40.984654999 +0000 UTC m=+4903.402001707" Dec 11 06:36:45 crc kubenswrapper[4628]: I1211 06:36:45.486014 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r4527" Dec 11 06:36:45 crc kubenswrapper[4628]: I1211 06:36:45.486555 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r4527" Dec 11 06:36:45 crc kubenswrapper[4628]: I1211 06:36:45.549482 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r4527" Dec 11 06:36:46 crc kubenswrapper[4628]: I1211 06:36:46.098516 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r4527" Dec 11 06:36:46 crc kubenswrapper[4628]: I1211 06:36:46.158556 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r4527"] Dec 11 06:36:48 crc kubenswrapper[4628]: I1211 06:36:48.057785 4628 generic.go:334] "Generic (PLEG): container finished" podID="1631206f-7c4d-4d8c-83e4-0f79a8dc403a" containerID="a2b1e53ce5b142164803e1c7cca6fa0725ead78f89906a3e7abe963da38234ff" exitCode=0 Dec 11 06:36:48 crc kubenswrapper[4628]: I1211 06:36:48.060487 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwvpk/crc-debug-ww4tm" event={"ID":"1631206f-7c4d-4d8c-83e4-0f79a8dc403a","Type":"ContainerDied","Data":"a2b1e53ce5b142164803e1c7cca6fa0725ead78f89906a3e7abe963da38234ff"} Dec 11 06:36:48 crc kubenswrapper[4628]: I1211 06:36:48.061194 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r4527" podUID="5149c12a-fa98-4588-9fda-4c8471752df1" containerName="registry-server" containerID="cri-o://0db397e734392f9ed7063db2b85103d147992623d92f60fd035b2136b8887e6a" gracePeriod=2 Dec 11 06:36:48 crc kubenswrapper[4628]: I1211 06:36:48.588817 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r4527" Dec 11 06:36:48 crc kubenswrapper[4628]: I1211 06:36:48.720828 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5149c12a-fa98-4588-9fda-4c8471752df1-catalog-content\") pod \"5149c12a-fa98-4588-9fda-4c8471752df1\" (UID: \"5149c12a-fa98-4588-9fda-4c8471752df1\") " Dec 11 06:36:48 crc kubenswrapper[4628]: I1211 06:36:48.721085 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5149c12a-fa98-4588-9fda-4c8471752df1-utilities\") pod \"5149c12a-fa98-4588-9fda-4c8471752df1\" (UID: \"5149c12a-fa98-4588-9fda-4c8471752df1\") " Dec 11 06:36:48 crc kubenswrapper[4628]: I1211 06:36:48.721182 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwj2q\" (UniqueName: \"kubernetes.io/projected/5149c12a-fa98-4588-9fda-4c8471752df1-kube-api-access-vwj2q\") pod \"5149c12a-fa98-4588-9fda-4c8471752df1\" (UID: \"5149c12a-fa98-4588-9fda-4c8471752df1\") " Dec 11 06:36:48 crc kubenswrapper[4628]: I1211 06:36:48.721750 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5149c12a-fa98-4588-9fda-4c8471752df1-utilities" (OuterVolumeSpecName: "utilities") pod "5149c12a-fa98-4588-9fda-4c8471752df1" (UID: "5149c12a-fa98-4588-9fda-4c8471752df1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:36:48 crc kubenswrapper[4628]: I1211 06:36:48.729064 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5149c12a-fa98-4588-9fda-4c8471752df1-kube-api-access-vwj2q" (OuterVolumeSpecName: "kube-api-access-vwj2q") pod "5149c12a-fa98-4588-9fda-4c8471752df1" (UID: "5149c12a-fa98-4588-9fda-4c8471752df1"). InnerVolumeSpecName "kube-api-access-vwj2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:36:48 crc kubenswrapper[4628]: I1211 06:36:48.772979 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5149c12a-fa98-4588-9fda-4c8471752df1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5149c12a-fa98-4588-9fda-4c8471752df1" (UID: "5149c12a-fa98-4588-9fda-4c8471752df1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:36:48 crc kubenswrapper[4628]: I1211 06:36:48.828180 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5149c12a-fa98-4588-9fda-4c8471752df1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 06:36:48 crc kubenswrapper[4628]: I1211 06:36:48.828214 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5149c12a-fa98-4588-9fda-4c8471752df1-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 06:36:48 crc kubenswrapper[4628]: I1211 06:36:48.828224 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwj2q\" (UniqueName: \"kubernetes.io/projected/5149c12a-fa98-4588-9fda-4c8471752df1-kube-api-access-vwj2q\") on node \"crc\" DevicePath \"\"" Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.068248 4628 generic.go:334] "Generic (PLEG): container finished" podID="5149c12a-fa98-4588-9fda-4c8471752df1" containerID="0db397e734392f9ed7063db2b85103d147992623d92f60fd035b2136b8887e6a" exitCode=0 Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.068318 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r4527" Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.068350 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r4527" event={"ID":"5149c12a-fa98-4588-9fda-4c8471752df1","Type":"ContainerDied","Data":"0db397e734392f9ed7063db2b85103d147992623d92f60fd035b2136b8887e6a"} Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.068398 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r4527" event={"ID":"5149c12a-fa98-4588-9fda-4c8471752df1","Type":"ContainerDied","Data":"65de236fae772dc8140504fbc2146f2b9a71fd230935e7ab4a819ddf55e30a4a"} Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.068416 4628 scope.go:117] "RemoveContainer" containerID="0db397e734392f9ed7063db2b85103d147992623d92f60fd035b2136b8887e6a" Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.132086 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lwvpk/crc-debug-ww4tm" Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.133060 4628 scope.go:117] "RemoveContainer" containerID="5dceba55c5f92179737abfae28454084d2135d85d40cc93212b70812113109be" Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.155433 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r4527"] Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.167024 4628 scope.go:117] "RemoveContainer" containerID="af72156e3f2c115f390abcf492eabe7ebe23432abe70e86c4c5bd21686eb4792" Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.169895 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r4527"] Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.214970 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lwvpk/crc-debug-ww4tm"] Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.225607 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lwvpk/crc-debug-ww4tm"] Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.227924 4628 scope.go:117] "RemoveContainer" containerID="0db397e734392f9ed7063db2b85103d147992623d92f60fd035b2136b8887e6a" Dec 11 06:36:49 crc kubenswrapper[4628]: E1211 06:36:49.228399 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0db397e734392f9ed7063db2b85103d147992623d92f60fd035b2136b8887e6a\": container with ID starting with 0db397e734392f9ed7063db2b85103d147992623d92f60fd035b2136b8887e6a not found: ID does not exist" containerID="0db397e734392f9ed7063db2b85103d147992623d92f60fd035b2136b8887e6a" Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.228442 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0db397e734392f9ed7063db2b85103d147992623d92f60fd035b2136b8887e6a"} err="failed to get container status \"0db397e734392f9ed7063db2b85103d147992623d92f60fd035b2136b8887e6a\": rpc error: code = NotFound desc = could not find container \"0db397e734392f9ed7063db2b85103d147992623d92f60fd035b2136b8887e6a\": container with ID starting with 0db397e734392f9ed7063db2b85103d147992623d92f60fd035b2136b8887e6a not found: ID does not exist" Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.228468 4628 scope.go:117] "RemoveContainer" containerID="5dceba55c5f92179737abfae28454084d2135d85d40cc93212b70812113109be" Dec 11 06:36:49 crc kubenswrapper[4628]: E1211 06:36:49.228822 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dceba55c5f92179737abfae28454084d2135d85d40cc93212b70812113109be\": container with ID starting with 5dceba55c5f92179737abfae28454084d2135d85d40cc93212b70812113109be not found: ID does not exist" containerID="5dceba55c5f92179737abfae28454084d2135d85d40cc93212b70812113109be" Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.228880 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dceba55c5f92179737abfae28454084d2135d85d40cc93212b70812113109be"} err="failed to get container status \"5dceba55c5f92179737abfae28454084d2135d85d40cc93212b70812113109be\": rpc error: code = NotFound desc = could not find container \"5dceba55c5f92179737abfae28454084d2135d85d40cc93212b70812113109be\": container with ID starting with 
5dceba55c5f92179737abfae28454084d2135d85d40cc93212b70812113109be not found: ID does not exist" Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.228913 4628 scope.go:117] "RemoveContainer" containerID="af72156e3f2c115f390abcf492eabe7ebe23432abe70e86c4c5bd21686eb4792" Dec 11 06:36:49 crc kubenswrapper[4628]: E1211 06:36:49.229804 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af72156e3f2c115f390abcf492eabe7ebe23432abe70e86c4c5bd21686eb4792\": container with ID starting with af72156e3f2c115f390abcf492eabe7ebe23432abe70e86c4c5bd21686eb4792 not found: ID does not exist" containerID="af72156e3f2c115f390abcf492eabe7ebe23432abe70e86c4c5bd21686eb4792" Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.229877 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af72156e3f2c115f390abcf492eabe7ebe23432abe70e86c4c5bd21686eb4792"} err="failed to get container status \"af72156e3f2c115f390abcf492eabe7ebe23432abe70e86c4c5bd21686eb4792\": rpc error: code = NotFound desc = could not find container \"af72156e3f2c115f390abcf492eabe7ebe23432abe70e86c4c5bd21686eb4792\": container with ID starting with af72156e3f2c115f390abcf492eabe7ebe23432abe70e86c4c5bd21686eb4792 not found: ID does not exist" Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.236758 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcbf9\" (UniqueName: \"kubernetes.io/projected/1631206f-7c4d-4d8c-83e4-0f79a8dc403a-kube-api-access-xcbf9\") pod \"1631206f-7c4d-4d8c-83e4-0f79a8dc403a\" (UID: \"1631206f-7c4d-4d8c-83e4-0f79a8dc403a\") " Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.236999 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1631206f-7c4d-4d8c-83e4-0f79a8dc403a-host\") pod \"1631206f-7c4d-4d8c-83e4-0f79a8dc403a\" (UID: \"1631206f-7c4d-4d8c-83e4-0f79a8dc403a\") " Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.237689 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1631206f-7c4d-4d8c-83e4-0f79a8dc403a-host" (OuterVolumeSpecName: "host") pod "1631206f-7c4d-4d8c-83e4-0f79a8dc403a" (UID: "1631206f-7c4d-4d8c-83e4-0f79a8dc403a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.244268 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1631206f-7c4d-4d8c-83e4-0f79a8dc403a-kube-api-access-xcbf9" (OuterVolumeSpecName: "kube-api-access-xcbf9") pod "1631206f-7c4d-4d8c-83e4-0f79a8dc403a" (UID: "1631206f-7c4d-4d8c-83e4-0f79a8dc403a"). InnerVolumeSpecName "kube-api-access-xcbf9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.339682 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcbf9\" (UniqueName: \"kubernetes.io/projected/1631206f-7c4d-4d8c-83e4-0f79a8dc403a-kube-api-access-xcbf9\") on node \"crc\" DevicePath \"\"" Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.339723 4628 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1631206f-7c4d-4d8c-83e4-0f79a8dc403a-host\") on node \"crc\" DevicePath \"\"" Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.921797 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1631206f-7c4d-4d8c-83e4-0f79a8dc403a" path="/var/lib/kubelet/pods/1631206f-7c4d-4d8c-83e4-0f79a8dc403a/volumes" Dec 11 06:36:49 crc kubenswrapper[4628]: I1211 06:36:49.922378 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5149c12a-fa98-4588-9fda-4c8471752df1" path="/var/lib/kubelet/pods/5149c12a-fa98-4588-9fda-4c8471752df1/volumes" Dec 11 06:36:50 crc kubenswrapper[4628]: I1211 06:36:50.078213 4628 scope.go:117] "RemoveContainer" containerID="a2b1e53ce5b142164803e1c7cca6fa0725ead78f89906a3e7abe963da38234ff" Dec 11 06:36:50 crc kubenswrapper[4628]: I1211 06:36:50.079194 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lwvpk/crc-debug-ww4tm" Dec 11 06:36:50 crc kubenswrapper[4628]: I1211 06:36:50.415506 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lwvpk/crc-debug-llj49"] Dec 11 06:36:50 crc kubenswrapper[4628]: E1211 06:36:50.416191 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5149c12a-fa98-4588-9fda-4c8471752df1" containerName="registry-server" Dec 11 06:36:50 crc kubenswrapper[4628]: I1211 06:36:50.416214 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="5149c12a-fa98-4588-9fda-4c8471752df1" containerName="registry-server" Dec 11 06:36:50 crc kubenswrapper[4628]: E1211 06:36:50.416236 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5149c12a-fa98-4588-9fda-4c8471752df1" containerName="extract-content" Dec 11 06:36:50 crc kubenswrapper[4628]: I1211 06:36:50.416245 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="5149c12a-fa98-4588-9fda-4c8471752df1" containerName="extract-content" Dec 11 06:36:50 crc kubenswrapper[4628]: E1211 06:36:50.416256 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5149c12a-fa98-4588-9fda-4c8471752df1" containerName="extract-utilities" Dec 11 06:36:50 crc kubenswrapper[4628]: I1211 06:36:50.416262 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="5149c12a-fa98-4588-9fda-4c8471752df1" containerName="extract-utilities" Dec 11 06:36:50 crc kubenswrapper[4628]: E1211 06:36:50.416287 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1631206f-7c4d-4d8c-83e4-0f79a8dc403a" containerName="container-00" Dec 11 06:36:50 crc kubenswrapper[4628]: I1211 06:36:50.416292 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="1631206f-7c4d-4d8c-83e4-0f79a8dc403a" containerName="container-00" Dec 11 06:36:50 crc kubenswrapper[4628]: I1211 06:36:50.416489 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="1631206f-7c4d-4d8c-83e4-0f79a8dc403a" containerName="container-00" Dec 11 06:36:50 crc kubenswrapper[4628]: I1211 06:36:50.416501 4628 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5149c12a-fa98-4588-9fda-4c8471752df1" containerName="registry-server" Dec 11 06:36:50 crc kubenswrapper[4628]: I1211 06:36:50.417202 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lwvpk/crc-debug-llj49" Dec 11 06:36:50 crc kubenswrapper[4628]: I1211 06:36:50.419963 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-lwvpk"/"default-dockercfg-7hxds" Dec 11 06:36:50 crc kubenswrapper[4628]: I1211 06:36:50.570411 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd5e1f08-0c6c-413b-9cde-73380728f4bd-host\") pod \"crc-debug-llj49\" (UID: \"cd5e1f08-0c6c-413b-9cde-73380728f4bd\") " pod="openshift-must-gather-lwvpk/crc-debug-llj49" Dec 11 06:36:50 crc kubenswrapper[4628]: I1211 06:36:50.570642 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qkv8\" (UniqueName: \"kubernetes.io/projected/cd5e1f08-0c6c-413b-9cde-73380728f4bd-kube-api-access-7qkv8\") pod \"crc-debug-llj49\" (UID: \"cd5e1f08-0c6c-413b-9cde-73380728f4bd\") " pod="openshift-must-gather-lwvpk/crc-debug-llj49" Dec 11 06:36:50 crc kubenswrapper[4628]: I1211 06:36:50.671940 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd5e1f08-0c6c-413b-9cde-73380728f4bd-host\") pod \"crc-debug-llj49\" (UID: \"cd5e1f08-0c6c-413b-9cde-73380728f4bd\") " pod="openshift-must-gather-lwvpk/crc-debug-llj49" Dec 11 06:36:50 crc kubenswrapper[4628]: I1211 06:36:50.672032 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qkv8\" (UniqueName: \"kubernetes.io/projected/cd5e1f08-0c6c-413b-9cde-73380728f4bd-kube-api-access-7qkv8\") pod \"crc-debug-llj49\" (UID: \"cd5e1f08-0c6c-413b-9cde-73380728f4bd\") " pod="openshift-must-gather-lwvpk/crc-debug-llj49" Dec 11 06:36:50 crc kubenswrapper[4628]: I1211 06:36:50.672079 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd5e1f08-0c6c-413b-9cde-73380728f4bd-host\") pod \"crc-debug-llj49\" (UID: \"cd5e1f08-0c6c-413b-9cde-73380728f4bd\") " pod="openshift-must-gather-lwvpk/crc-debug-llj49" Dec 11 06:36:50 crc kubenswrapper[4628]: I1211 06:36:50.705731 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qkv8\" (UniqueName: \"kubernetes.io/projected/cd5e1f08-0c6c-413b-9cde-73380728f4bd-kube-api-access-7qkv8\") pod \"crc-debug-llj49\" (UID: \"cd5e1f08-0c6c-413b-9cde-73380728f4bd\") " pod="openshift-must-gather-lwvpk/crc-debug-llj49" Dec 11 06:36:50 crc kubenswrapper[4628]: I1211 06:36:50.732390 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lwvpk/crc-debug-llj49" Dec 11 06:36:50 crc kubenswrapper[4628]: W1211 06:36:50.798966 4628 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd5e1f08_0c6c_413b_9cde_73380728f4bd.slice/crio-487b93f213db1322241d71988c08533001344aacda71b34945a87ef5ff08cf3c WatchSource:0}: Error finding container 487b93f213db1322241d71988c08533001344aacda71b34945a87ef5ff08cf3c: Status 404 returned error can't find the container with id 487b93f213db1322241d71988c08533001344aacda71b34945a87ef5ff08cf3c Dec 11 06:36:51 crc kubenswrapper[4628]: I1211 06:36:51.094838 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwvpk/crc-debug-llj49" event={"ID":"cd5e1f08-0c6c-413b-9cde-73380728f4bd","Type":"ContainerStarted","Data":"21f0228d74c8f547d4dfffe346e78f1a23bf95822ce20d803d557661861a6fdf"} Dec 11 06:36:51 crc kubenswrapper[4628]: I1211 06:36:51.095206 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwvpk/crc-debug-llj49" event={"ID":"cd5e1f08-0c6c-413b-9cde-73380728f4bd","Type":"ContainerStarted","Data":"487b93f213db1322241d71988c08533001344aacda71b34945a87ef5ff08cf3c"} Dec 11 06:36:51 crc kubenswrapper[4628]: I1211 06:36:51.115877 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lwvpk/crc-debug-llj49" podStartSLOduration=1.115859357 podStartE2EDuration="1.115859357s" podCreationTimestamp="2025-12-11 06:36:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-11 06:36:51.106780721 +0000 UTC m=+4913.524127429" watchObservedRunningTime="2025-12-11 06:36:51.115859357 +0000 UTC m=+4913.533206055" Dec 11 06:36:52 crc kubenswrapper[4628]: I1211 06:36:52.114022 4628 generic.go:334] "Generic (PLEG): container finished" podID="cd5e1f08-0c6c-413b-9cde-73380728f4bd" containerID="21f0228d74c8f547d4dfffe346e78f1a23bf95822ce20d803d557661861a6fdf" exitCode=0 Dec 11 06:36:52 crc kubenswrapper[4628]: I1211 06:36:52.114071 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwvpk/crc-debug-llj49" event={"ID":"cd5e1f08-0c6c-413b-9cde-73380728f4bd","Type":"ContainerDied","Data":"21f0228d74c8f547d4dfffe346e78f1a23bf95822ce20d803d557661861a6fdf"} Dec 11 06:36:53 crc kubenswrapper[4628]: I1211 06:36:53.222929 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lwvpk/crc-debug-llj49" Dec 11 06:36:53 crc kubenswrapper[4628]: I1211 06:36:53.337930 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd5e1f08-0c6c-413b-9cde-73380728f4bd-host\") pod \"cd5e1f08-0c6c-413b-9cde-73380728f4bd\" (UID: \"cd5e1f08-0c6c-413b-9cde-73380728f4bd\") " Dec 11 06:36:53 crc kubenswrapper[4628]: I1211 06:36:53.338103 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qkv8\" (UniqueName: \"kubernetes.io/projected/cd5e1f08-0c6c-413b-9cde-73380728f4bd-kube-api-access-7qkv8\") pod \"cd5e1f08-0c6c-413b-9cde-73380728f4bd\" (UID: \"cd5e1f08-0c6c-413b-9cde-73380728f4bd\") " Dec 11 06:36:53 crc kubenswrapper[4628]: I1211 06:36:53.339702 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd5e1f08-0c6c-413b-9cde-73380728f4bd-host" (OuterVolumeSpecName: "host") pod "cd5e1f08-0c6c-413b-9cde-73380728f4bd" (UID: "cd5e1f08-0c6c-413b-9cde-73380728f4bd"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 06:36:53 crc kubenswrapper[4628]: I1211 06:36:53.344933 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd5e1f08-0c6c-413b-9cde-73380728f4bd-kube-api-access-7qkv8" (OuterVolumeSpecName: "kube-api-access-7qkv8") pod "cd5e1f08-0c6c-413b-9cde-73380728f4bd" (UID: "cd5e1f08-0c6c-413b-9cde-73380728f4bd"). InnerVolumeSpecName "kube-api-access-7qkv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:36:53 crc kubenswrapper[4628]: I1211 06:36:53.370551 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lwvpk/crc-debug-llj49"] Dec 11 06:36:53 crc kubenswrapper[4628]: I1211 06:36:53.380215 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lwvpk/crc-debug-llj49"] Dec 11 06:36:53 crc kubenswrapper[4628]: I1211 06:36:53.440023 4628 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd5e1f08-0c6c-413b-9cde-73380728f4bd-host\") on node \"crc\" DevicePath \"\"" Dec 11 06:36:53 crc kubenswrapper[4628]: I1211 06:36:53.440056 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qkv8\" (UniqueName: \"kubernetes.io/projected/cd5e1f08-0c6c-413b-9cde-73380728f4bd-kube-api-access-7qkv8\") on node \"crc\" DevicePath \"\"" Dec 11 06:36:53 crc kubenswrapper[4628]: I1211 06:36:53.902317 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd5e1f08-0c6c-413b-9cde-73380728f4bd" path="/var/lib/kubelet/pods/cd5e1f08-0c6c-413b-9cde-73380728f4bd/volumes" Dec 11 06:36:54 crc kubenswrapper[4628]: I1211 06:36:54.134806 4628 scope.go:117] "RemoveContainer" containerID="21f0228d74c8f547d4dfffe346e78f1a23bf95822ce20d803d557661861a6fdf" Dec 11 06:36:54 crc kubenswrapper[4628]: I1211 06:36:54.134984 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lwvpk/crc-debug-llj49" Dec 11 06:36:54 crc kubenswrapper[4628]: I1211 06:36:54.842985 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lwvpk/crc-debug-mtgkh"] Dec 11 06:36:54 crc kubenswrapper[4628]: E1211 06:36:54.843515 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd5e1f08-0c6c-413b-9cde-73380728f4bd" containerName="container-00" Dec 11 06:36:54 crc kubenswrapper[4628]: I1211 06:36:54.843533 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd5e1f08-0c6c-413b-9cde-73380728f4bd" containerName="container-00" Dec 11 06:36:54 crc kubenswrapper[4628]: I1211 06:36:54.843790 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd5e1f08-0c6c-413b-9cde-73380728f4bd" containerName="container-00" Dec 11 06:36:54 crc kubenswrapper[4628]: I1211 06:36:54.844726 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lwvpk/crc-debug-mtgkh" Dec 11 06:36:54 crc kubenswrapper[4628]: I1211 06:36:54.850259 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-lwvpk"/"default-dockercfg-7hxds" Dec 11 06:36:54 crc kubenswrapper[4628]: I1211 06:36:54.967353 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db801331-d5d4-4d59-9ad5-2137f573d698-host\") pod \"crc-debug-mtgkh\" (UID: \"db801331-d5d4-4d59-9ad5-2137f573d698\") " pod="openshift-must-gather-lwvpk/crc-debug-mtgkh" Dec 11 06:36:54 crc kubenswrapper[4628]: I1211 06:36:54.967457 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p9q8\" (UniqueName: \"kubernetes.io/projected/db801331-d5d4-4d59-9ad5-2137f573d698-kube-api-access-5p9q8\") pod \"crc-debug-mtgkh\" (UID: \"db801331-d5d4-4d59-9ad5-2137f573d698\") " pod="openshift-must-gather-lwvpk/crc-debug-mtgkh" Dec 11 06:36:55 crc kubenswrapper[4628]: I1211 06:36:55.069529 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db801331-d5d4-4d59-9ad5-2137f573d698-host\") pod \"crc-debug-mtgkh\" (UID: \"db801331-d5d4-4d59-9ad5-2137f573d698\") " pod="openshift-must-gather-lwvpk/crc-debug-mtgkh" Dec 11 06:36:55 crc kubenswrapper[4628]: I1211 06:36:55.069749 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p9q8\" (UniqueName: \"kubernetes.io/projected/db801331-d5d4-4d59-9ad5-2137f573d698-kube-api-access-5p9q8\") pod \"crc-debug-mtgkh\" (UID: \"db801331-d5d4-4d59-9ad5-2137f573d698\") " pod="openshift-must-gather-lwvpk/crc-debug-mtgkh" Dec 11 06:36:55 crc kubenswrapper[4628]: I1211 06:36:55.071158 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db801331-d5d4-4d59-9ad5-2137f573d698-host\") pod \"crc-debug-mtgkh\" (UID: \"db801331-d5d4-4d59-9ad5-2137f573d698\") " pod="openshift-must-gather-lwvpk/crc-debug-mtgkh" Dec 11 06:36:55 crc kubenswrapper[4628]: I1211 06:36:55.108718 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p9q8\" (UniqueName: \"kubernetes.io/projected/db801331-d5d4-4d59-9ad5-2137f573d698-kube-api-access-5p9q8\") pod \"crc-debug-mtgkh\" (UID: \"db801331-d5d4-4d59-9ad5-2137f573d698\") " pod="openshift-must-gather-lwvpk/crc-debug-mtgkh" Dec 11 06:36:55 crc kubenswrapper[4628]: I1211 
06:36:55.172660 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lwvpk/crc-debug-mtgkh" Dec 11 06:36:56 crc kubenswrapper[4628]: I1211 06:36:56.152935 4628 generic.go:334] "Generic (PLEG): container finished" podID="db801331-d5d4-4d59-9ad5-2137f573d698" containerID="054020678c1ec9f7c1e45861178f5fdf6ea07382abf4b8745bfa0c260e9241f6" exitCode=0 Dec 11 06:36:56 crc kubenswrapper[4628]: I1211 06:36:56.153040 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwvpk/crc-debug-mtgkh" event={"ID":"db801331-d5d4-4d59-9ad5-2137f573d698","Type":"ContainerDied","Data":"054020678c1ec9f7c1e45861178f5fdf6ea07382abf4b8745bfa0c260e9241f6"} Dec 11 06:36:56 crc kubenswrapper[4628]: I1211 06:36:56.156446 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwvpk/crc-debug-mtgkh" event={"ID":"db801331-d5d4-4d59-9ad5-2137f573d698","Type":"ContainerStarted","Data":"9b2f38c99b6f1d9b603dcb2ac4b9c795083c30c22ab79fb7181ae3ec57c4c548"} Dec 11 06:36:56 crc kubenswrapper[4628]: I1211 06:36:56.191970 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lwvpk/crc-debug-mtgkh"] Dec 11 06:36:56 crc kubenswrapper[4628]: I1211 06:36:56.205036 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lwvpk/crc-debug-mtgkh"] Dec 11 06:36:57 crc kubenswrapper[4628]: I1211 06:36:57.251269 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lwvpk/crc-debug-mtgkh" Dec 11 06:36:57 crc kubenswrapper[4628]: I1211 06:36:57.408042 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db801331-d5d4-4d59-9ad5-2137f573d698-host\") pod \"db801331-d5d4-4d59-9ad5-2137f573d698\" (UID: \"db801331-d5d4-4d59-9ad5-2137f573d698\") " Dec 11 06:36:57 crc kubenswrapper[4628]: I1211 06:36:57.408158 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p9q8\" (UniqueName: \"kubernetes.io/projected/db801331-d5d4-4d59-9ad5-2137f573d698-kube-api-access-5p9q8\") pod \"db801331-d5d4-4d59-9ad5-2137f573d698\" (UID: \"db801331-d5d4-4d59-9ad5-2137f573d698\") " Dec 11 06:36:57 crc kubenswrapper[4628]: I1211 06:36:57.408380 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db801331-d5d4-4d59-9ad5-2137f573d698-host" (OuterVolumeSpecName: "host") pod "db801331-d5d4-4d59-9ad5-2137f573d698" (UID: "db801331-d5d4-4d59-9ad5-2137f573d698"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 11 06:36:57 crc kubenswrapper[4628]: I1211 06:36:57.408577 4628 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db801331-d5d4-4d59-9ad5-2137f573d698-host\") on node \"crc\" DevicePath \"\"" Dec 11 06:36:57 crc kubenswrapper[4628]: I1211 06:36:57.415127 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db801331-d5d4-4d59-9ad5-2137f573d698-kube-api-access-5p9q8" (OuterVolumeSpecName: "kube-api-access-5p9q8") pod "db801331-d5d4-4d59-9ad5-2137f573d698" (UID: "db801331-d5d4-4d59-9ad5-2137f573d698"). InnerVolumeSpecName "kube-api-access-5p9q8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:36:57 crc kubenswrapper[4628]: I1211 06:36:57.510814 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p9q8\" (UniqueName: \"kubernetes.io/projected/db801331-d5d4-4d59-9ad5-2137f573d698-kube-api-access-5p9q8\") on node \"crc\" DevicePath \"\"" Dec 11 06:36:57 crc kubenswrapper[4628]: I1211 06:36:57.929151 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db801331-d5d4-4d59-9ad5-2137f573d698" path="/var/lib/kubelet/pods/db801331-d5d4-4d59-9ad5-2137f573d698/volumes" Dec 11 06:36:58 crc kubenswrapper[4628]: I1211 06:36:58.172049 4628 scope.go:117] "RemoveContainer" containerID="054020678c1ec9f7c1e45861178f5fdf6ea07382abf4b8745bfa0c260e9241f6" Dec 11 06:36:58 crc kubenswrapper[4628]: I1211 06:36:58.172505 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lwvpk/crc-debug-mtgkh" Dec 11 06:37:01 crc kubenswrapper[4628]: I1211 06:37:01.427157 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:37:01 crc kubenswrapper[4628]: I1211 06:37:01.427622 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:37:31 crc kubenswrapper[4628]: I1211 06:37:31.427421 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:37:31 crc kubenswrapper[4628]: I1211 06:37:31.428010 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:37:40 crc kubenswrapper[4628]: I1211 06:37:40.484586 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-f64c5f7b6-ctrn9_b3b9644d-2578-4628-ac4c-28d16e0657e0/barbican-api/0.log" Dec 11 06:37:40 crc kubenswrapper[4628]: I1211 06:37:40.678885 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-f64c5f7b6-ctrn9_b3b9644d-2578-4628-ac4c-28d16e0657e0/barbican-api-log/0.log" Dec 11 06:37:40 crc kubenswrapper[4628]: I1211 06:37:40.764419 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-b5c776c64-wmwpw_391ad8e5-9c8c-463c-8d25-4d74e3f8cf94/barbican-keystone-listener/0.log" Dec 11 06:37:40 crc kubenswrapper[4628]: I1211 06:37:40.832650 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-b5c776c64-wmwpw_391ad8e5-9c8c-463c-8d25-4d74e3f8cf94/barbican-keystone-listener-log/0.log" Dec 11 06:37:40 crc kubenswrapper[4628]: I1211 06:37:40.988855 4628 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-759d9b665f-6pnnw_b223e08d-3dfd-4c2d-b720-fe142822a27c/barbican-worker/0.log" Dec 11 06:37:41 crc kubenswrapper[4628]: I1211 06:37:41.054998 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-759d9b665f-6pnnw_b223e08d-3dfd-4c2d-b720-fe142822a27c/barbican-worker-log/0.log" Dec 11 06:37:41 crc kubenswrapper[4628]: I1211 06:37:41.267531 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-7xj98_376d3aeb-b569-4e4e-847a-762ed8f12b35/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:37:41 crc kubenswrapper[4628]: I1211 06:37:41.371166 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e0091ba0-9c70-41dd-8f21-68968a10a308/ceilometer-central-agent/0.log" Dec 11 06:37:41 crc kubenswrapper[4628]: I1211 06:37:41.436422 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e0091ba0-9c70-41dd-8f21-68968a10a308/ceilometer-notification-agent/0.log" Dec 11 06:37:41 crc kubenswrapper[4628]: I1211 06:37:41.457939 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e0091ba0-9c70-41dd-8f21-68968a10a308/sg-core/0.log" Dec 11 06:37:41 crc kubenswrapper[4628]: I1211 06:37:41.519361 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e0091ba0-9c70-41dd-8f21-68968a10a308/proxy-httpd/0.log" Dec 11 06:37:41 crc kubenswrapper[4628]: I1211 06:37:41.744128 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_12485bc3-6a23-4772-8c00-2148c65fe10d/cinder-api-log/0.log" Dec 11 06:37:41 crc kubenswrapper[4628]: I1211 06:37:41.847954 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_12485bc3-6a23-4772-8c00-2148c65fe10d/cinder-api/0.log" Dec 11 06:37:41 crc kubenswrapper[4628]: I1211 06:37:41.922959 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_df6f8836-27ef-4cbd-aed1-0949861716db/cinder-scheduler/0.log" Dec 11 06:37:42 crc kubenswrapper[4628]: I1211 06:37:42.040073 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_df6f8836-27ef-4cbd-aed1-0949861716db/probe/0.log" Dec 11 06:37:42 crc kubenswrapper[4628]: I1211 06:37:42.126523 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-2gvgg_74ebd783-bcc7-4521-a9f2-450201f04c18/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:37:42 crc kubenswrapper[4628]: I1211 06:37:42.306078 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-7hj9d_6838fcd4-0c2b-4c92-880c-eb9029af8a00/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:37:42 crc kubenswrapper[4628]: I1211 06:37:42.346484 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-gz9jh_1842054b-c613-4c76-9cb8-3738bc44a946/init/0.log" Dec 11 06:37:42 crc kubenswrapper[4628]: I1211 06:37:42.520992 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-gz9jh_1842054b-c613-4c76-9cb8-3738bc44a946/init/0.log" Dec 11 06:37:42 crc kubenswrapper[4628]: I1211 06:37:42.619111 4628 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-pnjmd_4416beb7-730c-4898-b603-a123279eb238/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:37:42 crc kubenswrapper[4628]: I1211 06:37:42.815793 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-54ffdb7d8c-gz9jh_1842054b-c613-4c76-9cb8-3738bc44a946/dnsmasq-dns/0.log" Dec 11 06:37:42 crc kubenswrapper[4628]: I1211 06:37:42.868997 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_634cddf9-405e-42ee-a106-3c99b8d921d1/glance-httpd/0.log" Dec 11 06:37:42 crc kubenswrapper[4628]: I1211 06:37:42.930724 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_634cddf9-405e-42ee-a106-3c99b8d921d1/glance-log/0.log" Dec 11 06:37:43 crc kubenswrapper[4628]: I1211 06:37:43.153397 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_83368b19-5867-444e-a7ea-55683f0e6b26/glance-httpd/0.log" Dec 11 06:37:43 crc kubenswrapper[4628]: I1211 06:37:43.228950 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_83368b19-5867-444e-a7ea-55683f0e6b26/glance-log/0.log" Dec 11 06:37:43 crc kubenswrapper[4628]: I1211 06:37:43.600556 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7989644c86-scmh4_51e02694-e634-4a3b-8406-3b3b72007c2b/horizon/0.log" Dec 11 06:37:43 crc kubenswrapper[4628]: I1211 06:37:43.643365 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-h4t9v_cbe8ae7a-0268-477b-a232-fb89a86e6c30/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:37:44 crc kubenswrapper[4628]: I1211 06:37:44.033296 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-jhbk5_7802f047-ef49-4339-8783-fa927f841103/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:37:44 crc kubenswrapper[4628]: I1211 06:37:44.043619 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7989644c86-scmh4_51e02694-e634-4a3b-8406-3b3b72007c2b/horizon-log/0.log" Dec 11 06:37:44 crc kubenswrapper[4628]: I1211 06:37:44.468350 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29423881-sxmlb_c82db411-744d-4cc8-8ae5-3031c70241d4/keystone-cron/0.log" Dec 11 06:37:44 crc kubenswrapper[4628]: I1211 06:37:44.775666 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6dd99782-66a8-47e8-a4cf-d5f2805655dc/kube-state-metrics/0.log" Dec 11 06:37:44 crc kubenswrapper[4628]: I1211 06:37:44.810792 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-55f69568c9-2p2zq_3b27f97f-7392-47aa-8551-badeb28bce06/keystone-api/0.log" Dec 11 06:37:45 crc kubenswrapper[4628]: I1211 06:37:45.001000 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-m9wqw_10745043-8954-4864-9b9b-d3b2e8614e36/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:37:45 crc kubenswrapper[4628]: I1211 06:37:45.572574 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-5gcn7_c4cbdee7-e2f8-451b-bdeb-14f76b85b6a6/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:37:45 crc 
kubenswrapper[4628]: I1211 06:37:45.817003 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67df497849-l9zzv_9d505505-13f5-4899-b1a5-7f739066e73c/neutron-httpd/0.log" Dec 11 06:37:45 crc kubenswrapper[4628]: I1211 06:37:45.982406 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67df497849-l9zzv_9d505505-13f5-4899-b1a5-7f739066e73c/neutron-api/0.log" Dec 11 06:37:46 crc kubenswrapper[4628]: I1211 06:37:46.593761 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_ccc5f4b2-0364-42a0-abba-16c0e471f5c6/nova-cell0-conductor-conductor/0.log" Dec 11 06:37:47 crc kubenswrapper[4628]: I1211 06:37:47.181609 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_dfa4c6b3-3b46-4383-8813-38a038d8e0da/nova-api-log/0.log" Dec 11 06:37:47 crc kubenswrapper[4628]: I1211 06:37:47.204599 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_48ebe2ed-1819-4bd2-9f26-e8f392645684/nova-cell1-conductor-conductor/0.log" Dec 11 06:37:47 crc kubenswrapper[4628]: I1211 06:37:47.648676 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-fz78t_1b0b9e64-e4c3-4250-ae8d-319461717fcd/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:37:47 crc kubenswrapper[4628]: I1211 06:37:47.712921 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_dfa4c6b3-3b46-4383-8813-38a038d8e0da/nova-api-api/0.log" Dec 11 06:37:47 crc kubenswrapper[4628]: I1211 06:37:47.720241 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d1b29ff2-7a02-42ed-9dde-d998ad2e693f/nova-cell1-novncproxy-novncproxy/0.log" Dec 11 06:37:48 crc kubenswrapper[4628]: I1211 06:37:48.005876 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a14c2329-c910-45b3-a28c-258f07a31c5f/nova-metadata-log/0.log" Dec 11 06:37:48 crc kubenswrapper[4628]: I1211 06:37:48.496726 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_a14ea6c9-f372-463b-8485-a3411412cbe9/memcached/0.log" Dec 11 06:37:48 crc kubenswrapper[4628]: I1211 06:37:48.633553 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f5879bd1-c58f-4c7a-8158-8be2bd632bf8/mysql-bootstrap/0.log" Dec 11 06:37:49 crc kubenswrapper[4628]: I1211 06:37:49.033954 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f5879bd1-c58f-4c7a-8158-8be2bd632bf8/galera/0.log" Dec 11 06:37:49 crc kubenswrapper[4628]: I1211 06:37:49.038758 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f5879bd1-c58f-4c7a-8158-8be2bd632bf8/mysql-bootstrap/0.log" Dec 11 06:37:49 crc kubenswrapper[4628]: I1211 06:37:49.165799 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_997489dd-97c9-4359-9920-8bdb512f708b/nova-scheduler-scheduler/0.log" Dec 11 06:37:49 crc kubenswrapper[4628]: I1211 06:37:49.279000 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e4498a18-7449-45b3-9061-d3ffbfa4be5b/mysql-bootstrap/0.log" Dec 11 06:37:49 crc kubenswrapper[4628]: I1211 06:37:49.559290 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e4498a18-7449-45b3-9061-d3ffbfa4be5b/mysql-bootstrap/0.log" Dec 11 06:37:49 crc 
kubenswrapper[4628]: I1211 06:37:49.561374 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e4498a18-7449-45b3-9061-d3ffbfa4be5b/galera/0.log" Dec 11 06:37:49 crc kubenswrapper[4628]: I1211 06:37:49.600639 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a14c2329-c910-45b3-a28c-258f07a31c5f/nova-metadata-metadata/0.log" Dec 11 06:37:49 crc kubenswrapper[4628]: I1211 06:37:49.609423 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_75983d64-ba11-4ef7-a433-34863bd80b58/openstackclient/0.log" Dec 11 06:37:49 crc kubenswrapper[4628]: I1211 06:37:49.796959 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-szj8g_c4361dca-0563-4576-a32a-2f03e4f399a0/openstack-network-exporter/0.log" Dec 11 06:37:49 crc kubenswrapper[4628]: I1211 06:37:49.849762 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gncbg_7c72a5ae-bbee-41cd-bb23-b9feb77f594d/ovsdb-server-init/0.log" Dec 11 06:37:50 crc kubenswrapper[4628]: I1211 06:37:50.048657 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gncbg_7c72a5ae-bbee-41cd-bb23-b9feb77f594d/ovsdb-server/0.log" Dec 11 06:37:50 crc kubenswrapper[4628]: I1211 06:37:50.076831 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gncbg_7c72a5ae-bbee-41cd-bb23-b9feb77f594d/ovs-vswitchd/0.log" Dec 11 06:37:50 crc kubenswrapper[4628]: I1211 06:37:50.124394 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gncbg_7c72a5ae-bbee-41cd-bb23-b9feb77f594d/ovsdb-server-init/0.log" Dec 11 06:37:50 crc kubenswrapper[4628]: I1211 06:37:50.132397 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-qz7fr_d0885fe4-936a-4a13-b4e5-4aeee593c242/ovn-controller/0.log" Dec 11 06:37:50 crc kubenswrapper[4628]: I1211 06:37:50.327864 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-drhds_5ab6a157-55db-4fda-8066-c9fee33d98b4/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:37:50 crc kubenswrapper[4628]: I1211 06:37:50.418051 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_0ba5be80-485c-4b8b-8e1d-3326db7cc5a0/ovn-northd/0.log" Dec 11 06:37:50 crc kubenswrapper[4628]: I1211 06:37:50.453871 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_0ba5be80-485c-4b8b-8e1d-3326db7cc5a0/openstack-network-exporter/0.log" Dec 11 06:37:50 crc kubenswrapper[4628]: I1211 06:37:50.643341 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b16de833-9dc8-4e72-92b8-9374c7ab50bf/ovsdbserver-nb/0.log" Dec 11 06:37:50 crc kubenswrapper[4628]: I1211 06:37:50.704125 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b16de833-9dc8-4e72-92b8-9374c7ab50bf/openstack-network-exporter/0.log" Dec 11 06:37:50 crc kubenswrapper[4628]: I1211 06:37:50.704307 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b45a8a8a-00cb-482a-bfc5-149e693949c1/openstack-network-exporter/0.log" Dec 11 06:37:50 crc kubenswrapper[4628]: I1211 06:37:50.867552 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b45a8a8a-00cb-482a-bfc5-149e693949c1/ovsdbserver-sb/0.log" Dec 11 06:37:51 crc 
kubenswrapper[4628]: I1211 06:37:51.138362 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-589776fd4-wpbmd_bf66d5d0-5466-4c43-ab27-23c603bd90f7/placement-api/0.log" Dec 11 06:37:51 crc kubenswrapper[4628]: I1211 06:37:51.196710 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-589776fd4-wpbmd_bf66d5d0-5466-4c43-ab27-23c603bd90f7/placement-log/0.log" Dec 11 06:37:51 crc kubenswrapper[4628]: I1211 06:37:51.245727 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_38ba9ced-55a9-40ad-8581-45f8d87da5ef/setup-container/0.log" Dec 11 06:37:51 crc kubenswrapper[4628]: I1211 06:37:51.328665 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_38ba9ced-55a9-40ad-8581-45f8d87da5ef/setup-container/0.log" Dec 11 06:37:51 crc kubenswrapper[4628]: I1211 06:37:51.364228 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_38ba9ced-55a9-40ad-8581-45f8d87da5ef/rabbitmq/0.log" Dec 11 06:37:51 crc kubenswrapper[4628]: I1211 06:37:51.440243 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3c89e316-b7b8-4740-aa49-0c21052a51de/setup-container/0.log" Dec 11 06:37:51 crc kubenswrapper[4628]: I1211 06:37:51.640811 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3c89e316-b7b8-4740-aa49-0c21052a51de/setup-container/0.log" Dec 11 06:37:51 crc kubenswrapper[4628]: I1211 06:37:51.658105 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-x6ghn_0e07dc05-985f-429b-8c55-221b86fb63be/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:37:51 crc kubenswrapper[4628]: I1211 06:37:51.676175 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_3c89e316-b7b8-4740-aa49-0c21052a51de/rabbitmq/0.log" Dec 11 06:37:51 crc kubenswrapper[4628]: I1211 06:37:51.827352 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-7cjnm_cf8e2426-3f6e-4291-b9ea-77b91670d471/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:37:51 crc kubenswrapper[4628]: I1211 06:37:51.896322 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-wsw55_67fc31e9-87aa-48c9-9888-52a10d0858dd/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:37:52 crc kubenswrapper[4628]: I1211 06:37:52.021954 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-qjbxj_381944d6-a058-41f8-a452-82d1933510e3/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:37:52 crc kubenswrapper[4628]: I1211 06:37:52.120382 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-mfrfd_179d12ab-f93f-4ce5-a674-deed794d48f0/ssh-known-hosts-edpm-deployment/0.log" Dec 11 06:37:52 crc kubenswrapper[4628]: I1211 06:37:52.286140 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7588f48d9f-5vkfm_b87045d6-b3bc-468e-8121-1023f3f30de0/proxy-server/0.log" Dec 11 06:37:52 crc kubenswrapper[4628]: I1211 06:37:52.413852 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7588f48d9f-5vkfm_b87045d6-b3bc-468e-8121-1023f3f30de0/proxy-httpd/0.log" Dec 11 06:37:52 crc 
kubenswrapper[4628]: I1211 06:37:52.446272 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-t4rhc_5e9e66a2-bb38-4bab-b6a7-b7e397f87aa1/swift-ring-rebalance/0.log" Dec 11 06:37:52 crc kubenswrapper[4628]: I1211 06:37:52.619384 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/account-auditor/0.log" Dec 11 06:37:52 crc kubenswrapper[4628]: I1211 06:37:52.626285 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/account-replicator/0.log" Dec 11 06:37:52 crc kubenswrapper[4628]: I1211 06:37:52.656352 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/account-reaper/0.log" Dec 11 06:37:52 crc kubenswrapper[4628]: I1211 06:37:52.693463 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/account-server/0.log" Dec 11 06:37:52 crc kubenswrapper[4628]: I1211 06:37:52.728210 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/container-auditor/0.log" Dec 11 06:37:52 crc kubenswrapper[4628]: I1211 06:37:52.822237 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/container-updater/0.log" Dec 11 06:37:52 crc kubenswrapper[4628]: I1211 06:37:52.831510 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/container-server/0.log" Dec 11 06:37:52 crc kubenswrapper[4628]: I1211 06:37:52.890301 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/container-replicator/0.log" Dec 11 06:37:52 crc kubenswrapper[4628]: I1211 06:37:52.942644 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/object-auditor/0.log" Dec 11 06:37:52 crc kubenswrapper[4628]: I1211 06:37:52.984576 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/object-expirer/0.log" Dec 11 06:37:53 crc kubenswrapper[4628]: I1211 06:37:53.057768 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/object-replicator/0.log" Dec 11 06:37:53 crc kubenswrapper[4628]: I1211 06:37:53.090691 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/object-server/0.log" Dec 11 06:37:53 crc kubenswrapper[4628]: I1211 06:37:53.154106 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/object-updater/0.log" Dec 11 06:37:53 crc kubenswrapper[4628]: I1211 06:37:53.166775 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/swift-recon-cron/0.log" Dec 11 06:37:53 crc kubenswrapper[4628]: I1211 06:37:53.206716 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612f2afd-9958-4367-a8c0-13066a05cd11/rsync/0.log" Dec 11 06:37:53 crc kubenswrapper[4628]: I1211 06:37:53.430778 4628 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-f28db_70e52eb8-3a47-4192-9d87-3178a99becfe/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:37:53 crc kubenswrapper[4628]: I1211 06:37:53.513274 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_bdf75bdb-5535-4134-b9aa-f094e9e220fc/tempest-tests-tempest-tests-runner/0.log" Dec 11 06:37:53 crc kubenswrapper[4628]: I1211 06:37:53.541403 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_4072e36d-49c0-40a9-93d2-5700ef264f8b/test-operator-logs-container/0.log" Dec 11 06:37:53 crc kubenswrapper[4628]: I1211 06:37:53.722547 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-f2cpl_2f17b7ec-7ef4-4e90-85c4-a2b0296e58f2/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 11 06:38:01 crc kubenswrapper[4628]: I1211 06:38:01.426781 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:38:01 crc kubenswrapper[4628]: I1211 06:38:01.427338 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:38:01 crc kubenswrapper[4628]: I1211 06:38:01.427381 4628 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" Dec 11 06:38:01 crc kubenswrapper[4628]: I1211 06:38:01.428100 4628 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16"} pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 11 06:38:01 crc kubenswrapper[4628]: I1211 06:38:01.428151 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" containerID="cri-o://5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" gracePeriod=600 Dec 11 06:38:01 crc kubenswrapper[4628]: E1211 06:38:01.552505 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:38:01 crc kubenswrapper[4628]: I1211 06:38:01.729563 4628 generic.go:334] "Generic (PLEG): container finished" podID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" exitCode=0 Dec 11 06:38:01 crc kubenswrapper[4628]: I1211 
06:38:01.729601 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerDied","Data":"5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16"} Dec 11 06:38:01 crc kubenswrapper[4628]: I1211 06:38:01.729631 4628 scope.go:117] "RemoveContainer" containerID="88388dbd3453cd64d96e82a99d6a17e7eeefa79420b0d46578dd03a105614740" Dec 11 06:38:01 crc kubenswrapper[4628]: I1211 06:38:01.730230 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:38:01 crc kubenswrapper[4628]: E1211 06:38:01.730488 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:38:15 crc kubenswrapper[4628]: I1211 06:38:15.894091 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:38:15 crc kubenswrapper[4628]: E1211 06:38:15.894950 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:38:19 crc kubenswrapper[4628]: I1211 06:38:19.973422 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7_be5c0815-ff74-4d42-b5fa-5c3291e5f71d/util/0.log" Dec 11 06:38:20 crc kubenswrapper[4628]: I1211 06:38:20.487916 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7_be5c0815-ff74-4d42-b5fa-5c3291e5f71d/pull/0.log" Dec 11 06:38:20 crc kubenswrapper[4628]: I1211 06:38:20.509421 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7_be5c0815-ff74-4d42-b5fa-5c3291e5f71d/util/0.log" Dec 11 06:38:20 crc kubenswrapper[4628]: I1211 06:38:20.557415 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7_be5c0815-ff74-4d42-b5fa-5c3291e5f71d/pull/0.log" Dec 11 06:38:20 crc kubenswrapper[4628]: I1211 06:38:20.827437 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7_be5c0815-ff74-4d42-b5fa-5c3291e5f71d/pull/0.log" Dec 11 06:38:20 crc kubenswrapper[4628]: I1211 06:38:20.838609 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7_be5c0815-ff74-4d42-b5fa-5c3291e5f71d/util/0.log" Dec 11 06:38:20 crc kubenswrapper[4628]: I1211 06:38:20.871325 4628 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_5e8c687eeedf598fcd5623439e30cae040f0ed79f3b482714b3eb51547cknq7_be5c0815-ff74-4d42-b5fa-5c3291e5f71d/extract/0.log" Dec 11 06:38:21 crc kubenswrapper[4628]: I1211 06:38:21.037763 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-9d9wj_f7d58419-0988-4a35-800f-2298db8e6597/kube-rbac-proxy/0.log" Dec 11 06:38:21 crc kubenswrapper[4628]: I1211 06:38:21.118082 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-9d9wj_f7d58419-0988-4a35-800f-2298db8e6597/manager/0.log" Dec 11 06:38:21 crc kubenswrapper[4628]: I1211 06:38:21.149609 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-h5xhk_43de67af-1cf5-4412-833e-e95e2ffcc47b/kube-rbac-proxy/0.log" Dec 11 06:38:21 crc kubenswrapper[4628]: I1211 06:38:21.287150 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-h5xhk_43de67af-1cf5-4412-833e-e95e2ffcc47b/manager/0.log" Dec 11 06:38:21 crc kubenswrapper[4628]: I1211 06:38:21.364948 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-whlx7_2f46589d-ec5b-48e9-8f64-741a6a5b3e84/kube-rbac-proxy/0.log" Dec 11 06:38:21 crc kubenswrapper[4628]: I1211 06:38:21.378418 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-whlx7_2f46589d-ec5b-48e9-8f64-741a6a5b3e84/manager/0.log" Dec 11 06:38:21 crc kubenswrapper[4628]: I1211 06:38:21.537456 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-v2lxt_dbdd3dcf-94cf-4b1e-9918-5d8efbe60360/kube-rbac-proxy/0.log" Dec 11 06:38:21 crc kubenswrapper[4628]: I1211 06:38:21.603403 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-v2lxt_dbdd3dcf-94cf-4b1e-9918-5d8efbe60360/manager/0.log" Dec 11 06:38:21 crc kubenswrapper[4628]: I1211 06:38:21.782287 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-9bcfl_f041b1fa-37ae-46fc-b6b0-301da06c1ff7/kube-rbac-proxy/0.log" Dec 11 06:38:21 crc kubenswrapper[4628]: I1211 06:38:21.789536 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-9bcfl_f041b1fa-37ae-46fc-b6b0-301da06c1ff7/manager/0.log" Dec 11 06:38:21 crc kubenswrapper[4628]: I1211 06:38:21.865109 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-vcz8d_232e8d69-426a-4259-93ab-1ebb4fa89a17/kube-rbac-proxy/0.log" Dec 11 06:38:21 crc kubenswrapper[4628]: I1211 06:38:21.995986 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-vcz8d_232e8d69-426a-4259-93ab-1ebb4fa89a17/manager/0.log" Dec 11 06:38:22 crc kubenswrapper[4628]: I1211 06:38:22.105382 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-6ff94_ae8e31fb-df50-4c43-af56-9c01af34f181/kube-rbac-proxy/0.log" Dec 11 06:38:22 crc kubenswrapper[4628]: I1211 06:38:22.365079 4628 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-x4p8r_c8834adf-70c2-46a6-a5d7-bdb2ddfc91d2/kube-rbac-proxy/0.log" Dec 11 06:38:22 crc kubenswrapper[4628]: I1211 06:38:22.378919 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-6ff94_ae8e31fb-df50-4c43-af56-9c01af34f181/manager/0.log" Dec 11 06:38:22 crc kubenswrapper[4628]: I1211 06:38:22.428196 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-x4p8r_c8834adf-70c2-46a6-a5d7-bdb2ddfc91d2/manager/0.log" Dec 11 06:38:22 crc kubenswrapper[4628]: I1211 06:38:22.640390 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-z6dn7_d0e69cfa-5f08-4640-b9f8-b7c27ef8660f/kube-rbac-proxy/0.log" Dec 11 06:38:22 crc kubenswrapper[4628]: I1211 06:38:22.658976 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-z6dn7_d0e69cfa-5f08-4640-b9f8-b7c27ef8660f/manager/0.log" Dec 11 06:38:22 crc kubenswrapper[4628]: I1211 06:38:22.796735 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-bqcdn_c8063e93-9008-453c-805c-487456b5e0ac/kube-rbac-proxy/0.log" Dec 11 06:38:22 crc kubenswrapper[4628]: I1211 06:38:22.874829 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-bqcdn_c8063e93-9008-453c-805c-487456b5e0ac/manager/0.log" Dec 11 06:38:22 crc kubenswrapper[4628]: I1211 06:38:22.922166 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-jqz2j_88d0bbcc-5138-434d-811b-d8db056922cb/kube-rbac-proxy/0.log" Dec 11 06:38:23 crc kubenswrapper[4628]: I1211 06:38:23.048402 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-jqz2j_88d0bbcc-5138-434d-811b-d8db056922cb/manager/0.log" Dec 11 06:38:23 crc kubenswrapper[4628]: I1211 06:38:23.180648 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-cjb98_c2e9f8e4-3eda-4227-ad4a-8f8641f88612/kube-rbac-proxy/0.log" Dec 11 06:38:23 crc kubenswrapper[4628]: I1211 06:38:23.234314 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-cjb98_c2e9f8e4-3eda-4227-ad4a-8f8641f88612/manager/0.log" Dec 11 06:38:23 crc kubenswrapper[4628]: I1211 06:38:23.386591 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-w5xrs_d92dcd20-90f9-4499-bae5-f117cf41b4d5/kube-rbac-proxy/0.log" Dec 11 06:38:23 crc kubenswrapper[4628]: I1211 06:38:23.529087 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-w5xrs_d92dcd20-90f9-4499-bae5-f117cf41b4d5/manager/0.log" Dec 11 06:38:23 crc kubenswrapper[4628]: I1211 06:38:23.545852 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-vftnq_a7d3410e-df7b-4de8-aa0f-4c6de9e251e7/kube-rbac-proxy/0.log" Dec 11 06:38:23 crc kubenswrapper[4628]: I1211 
06:38:23.611944 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-vftnq_a7d3410e-df7b-4de8-aa0f-4c6de9e251e7/manager/0.log" Dec 11 06:38:23 crc kubenswrapper[4628]: I1211 06:38:23.746295 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879f9w8m4_3112c087-1436-4f0a-8b0c-6000b07a0f77/kube-rbac-proxy/0.log" Dec 11 06:38:23 crc kubenswrapper[4628]: I1211 06:38:23.750199 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879f9w8m4_3112c087-1436-4f0a-8b0c-6000b07a0f77/manager/0.log" Dec 11 06:38:24 crc kubenswrapper[4628]: I1211 06:38:24.074191 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zp8gf_589fc89a-de3e-4916-81a4-5972e3bd2410/registry-server/0.log" Dec 11 06:38:24 crc kubenswrapper[4628]: I1211 06:38:24.229839 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7fb58bb479-g2k9b_91ff2419-7fdf-4656-8d3a-69295ad50387/operator/0.log" Dec 11 06:38:24 crc kubenswrapper[4628]: I1211 06:38:24.272902 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-nc2xx_bca7bee3-0202-48ba-b0e9-3353f6ab0938/kube-rbac-proxy/0.log" Dec 11 06:38:24 crc kubenswrapper[4628]: I1211 06:38:24.403483 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-nc2xx_bca7bee3-0202-48ba-b0e9-3353f6ab0938/manager/0.log" Dec 11 06:38:24 crc kubenswrapper[4628]: I1211 06:38:24.644650 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-qxsdk_ee29a0b0-46f9-45f6-b356-dde79504d5cc/kube-rbac-proxy/0.log" Dec 11 06:38:24 crc kubenswrapper[4628]: I1211 06:38:24.682339 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-qxsdk_ee29a0b0-46f9-45f6-b356-dde79504d5cc/manager/0.log" Dec 11 06:38:24 crc kubenswrapper[4628]: I1211 06:38:24.897951 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-l85kc_938faeea-3048-4d4a-8f3d-e22b31c73f47/operator/0.log" Dec 11 06:38:24 crc kubenswrapper[4628]: I1211 06:38:24.984158 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-zvjrq_2b786de1-276f-470c-b60a-e93596dd9e47/kube-rbac-proxy/0.log" Dec 11 06:38:25 crc kubenswrapper[4628]: I1211 06:38:25.019230 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-zvjrq_2b786de1-276f-470c-b60a-e93596dd9e47/manager/0.log" Dec 11 06:38:25 crc kubenswrapper[4628]: I1211 06:38:25.058903 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7546d6d447-f9qwn_c0ac60c7-7b87-490a-9107-ad5de9864845/manager/0.log" Dec 11 06:38:25 crc kubenswrapper[4628]: I1211 06:38:25.226664 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-2q4b9_2b9ef50b-db17-4df4-a936-5a02a25f61d7/manager/0.log" Dec 11 06:38:25 crc 
kubenswrapper[4628]: I1211 06:38:25.229901 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-2q4b9_2b9ef50b-db17-4df4-a936-5a02a25f61d7/kube-rbac-proxy/0.log" Dec 11 06:38:25 crc kubenswrapper[4628]: I1211 06:38:25.336989 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-tnvqg_9dc3edb0-5d7f-4b4f-bea2-5f9c25b222fe/kube-rbac-proxy/0.log" Dec 11 06:38:25 crc kubenswrapper[4628]: I1211 06:38:25.457820 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-tnvqg_9dc3edb0-5d7f-4b4f-bea2-5f9c25b222fe/manager/0.log" Dec 11 06:38:25 crc kubenswrapper[4628]: I1211 06:38:25.482067 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75944c9b7-l2wf4_53a3113c-a3d2-42c8-8ab8-b26b448a728a/kube-rbac-proxy/0.log" Dec 11 06:38:25 crc kubenswrapper[4628]: I1211 06:38:25.494541 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-75944c9b7-l2wf4_53a3113c-a3d2-42c8-8ab8-b26b448a728a/manager/0.log" Dec 11 06:38:29 crc kubenswrapper[4628]: I1211 06:38:29.729524 4628 scope.go:117] "RemoveContainer" containerID="7f313b86ae33f5e3d1e6e4690d5a8491519aa4d9306e8fc6581bd592a9424e66" Dec 11 06:38:29 crc kubenswrapper[4628]: I1211 06:38:29.760671 4628 scope.go:117] "RemoveContainer" containerID="a92f1427d884780f6f492c76c043414eac48399708b1c541177af1ca6a376e7e" Dec 11 06:38:29 crc kubenswrapper[4628]: I1211 06:38:29.789373 4628 scope.go:117] "RemoveContainer" containerID="b41971b1c0d792a14c70b2281aa3cde93b79600f25c3203c2f6c3d99817c99dc" Dec 11 06:38:30 crc kubenswrapper[4628]: I1211 06:38:30.889111 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:38:30 crc kubenswrapper[4628]: E1211 06:38:30.889594 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:38:41 crc kubenswrapper[4628]: I1211 06:38:41.890121 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:38:41 crc kubenswrapper[4628]: E1211 06:38:41.890870 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:38:46 crc kubenswrapper[4628]: I1211 06:38:46.165384 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-78pgf_e22056a0-8001-488d-9dd7-9368d4a459e8/control-plane-machine-set-operator/0.log" Dec 11 06:38:46 crc kubenswrapper[4628]: I1211 06:38:46.359957 4628 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lt4q5_575ba7ec-e024-40c7-be59-44a90232b4f2/machine-api-operator/0.log" Dec 11 06:38:46 crc kubenswrapper[4628]: I1211 06:38:46.360003 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lt4q5_575ba7ec-e024-40c7-be59-44a90232b4f2/kube-rbac-proxy/0.log" Dec 11 06:38:55 crc kubenswrapper[4628]: I1211 06:38:55.889316 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:38:55 crc kubenswrapper[4628]: E1211 06:38:55.891131 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:38:59 crc kubenswrapper[4628]: I1211 06:38:59.185801 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-gfsxf_6f9a6c48-1127-4e6c-bc88-133de5ba68e1/cert-manager-controller/0.log" Dec 11 06:38:59 crc kubenswrapper[4628]: I1211 06:38:59.365129 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-2fjb7_e08b2800-f237-4960-a939-b24a4f34b340/cert-manager-cainjector/0.log" Dec 11 06:38:59 crc kubenswrapper[4628]: I1211 06:38:59.419157 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-lf47f_2ab657c9-076c-4a39-9928-e92e8e276547/cert-manager-webhook/0.log" Dec 11 06:39:08 crc kubenswrapper[4628]: I1211 06:39:08.889175 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:39:08 crc kubenswrapper[4628]: E1211 06:39:08.890924 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:39:13 crc kubenswrapper[4628]: I1211 06:39:13.943811 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-mpq2p_6b801585-cb83-40f7-ab06-68951c4455c6/nmstate-console-plugin/0.log" Dec 11 06:39:14 crc kubenswrapper[4628]: I1211 06:39:14.057080 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-lg7wb_c2471a0c-a9c4-4323-9fb6-e67872046a7d/nmstate-handler/0.log" Dec 11 06:39:14 crc kubenswrapper[4628]: I1211 06:39:14.166666 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-tqdqm_3a77bda3-3bb4-402d-a4a9-df9e47e8ff39/kube-rbac-proxy/0.log" Dec 11 06:39:14 crc kubenswrapper[4628]: I1211 06:39:14.180639 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-tqdqm_3a77bda3-3bb4-402d-a4a9-df9e47e8ff39/nmstate-metrics/0.log" Dec 11 06:39:14 crc kubenswrapper[4628]: I1211 06:39:14.319660 4628 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-589fs_ae1d3899-1bda-4ad7-8512-9582b6fe2c54/nmstate-operator/0.log" Dec 11 06:39:14 crc kubenswrapper[4628]: I1211 06:39:14.412988 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-c54rd_ddeebf78-e410-4562-a173-563b43b1b322/nmstate-webhook/0.log" Dec 11 06:39:22 crc kubenswrapper[4628]: I1211 06:39:22.889248 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:39:22 crc kubenswrapper[4628]: E1211 06:39:22.890048 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:39:31 crc kubenswrapper[4628]: I1211 06:39:31.695682 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-q4v8q_7fbbee42-6c6f-4b6f-a8f4-a7acb4686612/kube-rbac-proxy/0.log" Dec 11 06:39:31 crc kubenswrapper[4628]: I1211 06:39:31.843986 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-q4v8q_7fbbee42-6c6f-4b6f-a8f4-a7acb4686612/controller/0.log" Dec 11 06:39:31 crc kubenswrapper[4628]: I1211 06:39:31.983688 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-frr-files/0.log" Dec 11 06:39:32 crc kubenswrapper[4628]: I1211 06:39:32.162382 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-frr-files/0.log" Dec 11 06:39:32 crc kubenswrapper[4628]: I1211 06:39:32.196951 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-reloader/0.log" Dec 11 06:39:32 crc kubenswrapper[4628]: I1211 06:39:32.199965 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-reloader/0.log" Dec 11 06:39:32 crc kubenswrapper[4628]: I1211 06:39:32.200026 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-metrics/0.log" Dec 11 06:39:32 crc kubenswrapper[4628]: I1211 06:39:32.433063 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-metrics/0.log" Dec 11 06:39:32 crc kubenswrapper[4628]: I1211 06:39:32.433766 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-reloader/0.log" Dec 11 06:39:32 crc kubenswrapper[4628]: I1211 06:39:32.498496 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-frr-files/0.log" Dec 11 06:39:32 crc kubenswrapper[4628]: I1211 06:39:32.517812 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-metrics/0.log" Dec 11 06:39:32 crc kubenswrapper[4628]: I1211 06:39:32.686790 4628 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-reloader/0.log" Dec 11 06:39:32 crc kubenswrapper[4628]: I1211 06:39:32.699290 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-frr-files/0.log" Dec 11 06:39:32 crc kubenswrapper[4628]: I1211 06:39:32.790871 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/cp-metrics/0.log" Dec 11 06:39:32 crc kubenswrapper[4628]: I1211 06:39:32.794135 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/controller/0.log" Dec 11 06:39:32 crc kubenswrapper[4628]: I1211 06:39:32.943143 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/frr-metrics/0.log" Dec 11 06:39:33 crc kubenswrapper[4628]: I1211 06:39:33.119985 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/kube-rbac-proxy/0.log" Dec 11 06:39:33 crc kubenswrapper[4628]: I1211 06:39:33.122721 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/kube-rbac-proxy-frr/0.log" Dec 11 06:39:33 crc kubenswrapper[4628]: I1211 06:39:33.273462 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/reloader/0.log" Dec 11 06:39:33 crc kubenswrapper[4628]: I1211 06:39:33.477533 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-5n5bs_abb0cb74-4b39-47f1-9a3a-cae28b6c32f6/frr-k8s-webhook-server/0.log" Dec 11 06:39:33 crc kubenswrapper[4628]: I1211 06:39:33.788576 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5f9c8b77b-f478p_8fb96fd9-ce49-43f8-baa0-c2fdecc79e0e/manager/0.log" Dec 11 06:39:34 crc kubenswrapper[4628]: I1211 06:39:34.029089 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-68cfc95d7c-4ssjf_f91be819-4cd2-4c94-98a2-108b05ab0a23/webhook-server/0.log" Dec 11 06:39:34 crc kubenswrapper[4628]: I1211 06:39:34.164557 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-97bqw_1c9bb5b6-577c-4b87-af2e-445ca30f9732/frr/0.log" Dec 11 06:39:34 crc kubenswrapper[4628]: I1211 06:39:34.168582 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v2km9_0cea36da-fd5c-416d-ab52-9500bc3fae0e/kube-rbac-proxy/0.log" Dec 11 06:39:34 crc kubenswrapper[4628]: I1211 06:39:34.531021 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-v2km9_0cea36da-fd5c-416d-ab52-9500bc3fae0e/speaker/0.log" Dec 11 06:39:35 crc kubenswrapper[4628]: I1211 06:39:35.889327 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:39:35 crc kubenswrapper[4628]: E1211 06:39:35.889559 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:39:48 crc kubenswrapper[4628]: I1211 06:39:48.475569 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh_c0e6a71a-6351-4860-a562-05df960a3f2c/util/0.log" Dec 11 06:39:48 crc kubenswrapper[4628]: I1211 06:39:48.889117 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:39:48 crc kubenswrapper[4628]: E1211 06:39:48.889353 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:39:48 crc kubenswrapper[4628]: I1211 06:39:48.923730 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh_c0e6a71a-6351-4860-a562-05df960a3f2c/pull/0.log" Dec 11 06:39:48 crc kubenswrapper[4628]: I1211 06:39:48.948048 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh_c0e6a71a-6351-4860-a562-05df960a3f2c/util/0.log" Dec 11 06:39:49 crc kubenswrapper[4628]: I1211 06:39:49.003137 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh_c0e6a71a-6351-4860-a562-05df960a3f2c/pull/0.log" Dec 11 06:39:49 crc kubenswrapper[4628]: I1211 06:39:49.137897 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh_c0e6a71a-6351-4860-a562-05df960a3f2c/util/0.log" Dec 11 06:39:49 crc kubenswrapper[4628]: I1211 06:39:49.183036 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh_c0e6a71a-6351-4860-a562-05df960a3f2c/pull/0.log" Dec 11 06:39:49 crc kubenswrapper[4628]: I1211 06:39:49.244584 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212frk5nh_c0e6a71a-6351-4860-a562-05df960a3f2c/extract/0.log" Dec 11 06:39:49 crc kubenswrapper[4628]: I1211 06:39:49.663490 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx_861570ba-65cf-4e91-90c0-c26b0c452c0e/util/0.log" Dec 11 06:39:49 crc kubenswrapper[4628]: I1211 06:39:49.864103 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx_861570ba-65cf-4e91-90c0-c26b0c452c0e/pull/0.log" Dec 11 06:39:49 crc kubenswrapper[4628]: I1211 06:39:49.887038 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx_861570ba-65cf-4e91-90c0-c26b0c452c0e/pull/0.log" Dec 11 06:39:49 crc kubenswrapper[4628]: I1211 06:39:49.928466 4628 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx_861570ba-65cf-4e91-90c0-c26b0c452c0e/util/0.log" Dec 11 06:39:50 crc kubenswrapper[4628]: I1211 06:39:50.069773 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx_861570ba-65cf-4e91-90c0-c26b0c452c0e/pull/0.log" Dec 11 06:39:50 crc kubenswrapper[4628]: I1211 06:39:50.074786 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx_861570ba-65cf-4e91-90c0-c26b0c452c0e/util/0.log" Dec 11 06:39:50 crc kubenswrapper[4628]: I1211 06:39:50.147221 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83gdqgx_861570ba-65cf-4e91-90c0-c26b0c452c0e/extract/0.log" Dec 11 06:39:50 crc kubenswrapper[4628]: I1211 06:39:50.298271 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wg8m2_49c38217-7f74-447b-a8ab-b7bf727d90e5/extract-utilities/0.log" Dec 11 06:39:50 crc kubenswrapper[4628]: I1211 06:39:50.509534 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wg8m2_49c38217-7f74-447b-a8ab-b7bf727d90e5/extract-content/0.log" Dec 11 06:39:50 crc kubenswrapper[4628]: I1211 06:39:50.567466 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wg8m2_49c38217-7f74-447b-a8ab-b7bf727d90e5/extract-utilities/0.log" Dec 11 06:39:50 crc kubenswrapper[4628]: I1211 06:39:50.612506 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wg8m2_49c38217-7f74-447b-a8ab-b7bf727d90e5/extract-content/0.log" Dec 11 06:39:50 crc kubenswrapper[4628]: I1211 06:39:50.830868 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wg8m2_49c38217-7f74-447b-a8ab-b7bf727d90e5/extract-utilities/0.log" Dec 11 06:39:50 crc kubenswrapper[4628]: I1211 06:39:50.948323 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wg8m2_49c38217-7f74-447b-a8ab-b7bf727d90e5/extract-content/0.log" Dec 11 06:39:51 crc kubenswrapper[4628]: I1211 06:39:51.001787 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wg8m2_49c38217-7f74-447b-a8ab-b7bf727d90e5/registry-server/0.log" Dec 11 06:39:51 crc kubenswrapper[4628]: I1211 06:39:51.084991 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrfw_e99689f2-e449-4a63-aee4-2c22e629616a/extract-utilities/0.log" Dec 11 06:39:51 crc kubenswrapper[4628]: I1211 06:39:51.634703 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrfw_e99689f2-e449-4a63-aee4-2c22e629616a/extract-utilities/0.log" Dec 11 06:39:51 crc kubenswrapper[4628]: I1211 06:39:51.643595 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrfw_e99689f2-e449-4a63-aee4-2c22e629616a/extract-content/0.log" Dec 11 06:39:51 crc kubenswrapper[4628]: I1211 06:39:51.655623 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrfw_e99689f2-e449-4a63-aee4-2c22e629616a/extract-content/0.log" Dec 11 06:39:51 
crc kubenswrapper[4628]: I1211 06:39:51.920176 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrfw_e99689f2-e449-4a63-aee4-2c22e629616a/extract-content/0.log" Dec 11 06:39:51 crc kubenswrapper[4628]: I1211 06:39:51.998988 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrfw_e99689f2-e449-4a63-aee4-2c22e629616a/extract-utilities/0.log" Dec 11 06:39:52 crc kubenswrapper[4628]: I1211 06:39:52.316597 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-slrfw_e99689f2-e449-4a63-aee4-2c22e629616a/registry-server/0.log" Dec 11 06:39:52 crc kubenswrapper[4628]: I1211 06:39:52.403489 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8wvwr_7683eae0-a7bd-46c4-867e-b15d65fc5e7e/marketplace-operator/0.log" Dec 11 06:39:52 crc kubenswrapper[4628]: I1211 06:39:52.438251 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8bd7x_a0ff0009-81bb-47da-aab8-5caeeec49061/extract-utilities/0.log" Dec 11 06:39:52 crc kubenswrapper[4628]: I1211 06:39:52.591899 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8bd7x_a0ff0009-81bb-47da-aab8-5caeeec49061/extract-content/0.log" Dec 11 06:39:52 crc kubenswrapper[4628]: I1211 06:39:52.666374 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8bd7x_a0ff0009-81bb-47da-aab8-5caeeec49061/extract-utilities/0.log" Dec 11 06:39:52 crc kubenswrapper[4628]: I1211 06:39:52.707161 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8bd7x_a0ff0009-81bb-47da-aab8-5caeeec49061/extract-content/0.log" Dec 11 06:39:52 crc kubenswrapper[4628]: I1211 06:39:52.841642 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8bd7x_a0ff0009-81bb-47da-aab8-5caeeec49061/extract-content/0.log" Dec 11 06:39:53 crc kubenswrapper[4628]: I1211 06:39:53.063041 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8bd7x_a0ff0009-81bb-47da-aab8-5caeeec49061/registry-server/0.log" Dec 11 06:39:53 crc kubenswrapper[4628]: I1211 06:39:53.076631 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8bd7x_a0ff0009-81bb-47da-aab8-5caeeec49061/extract-utilities/0.log" Dec 11 06:39:53 crc kubenswrapper[4628]: I1211 06:39:53.195249 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v9ml8_1faf62cf-c5ee-426d-afb5-25a16930ddbd/extract-utilities/0.log" Dec 11 06:39:53 crc kubenswrapper[4628]: I1211 06:39:53.351798 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v9ml8_1faf62cf-c5ee-426d-afb5-25a16930ddbd/extract-utilities/0.log" Dec 11 06:39:53 crc kubenswrapper[4628]: I1211 06:39:53.375139 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v9ml8_1faf62cf-c5ee-426d-afb5-25a16930ddbd/extract-content/0.log" Dec 11 06:39:53 crc kubenswrapper[4628]: I1211 06:39:53.406551 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v9ml8_1faf62cf-c5ee-426d-afb5-25a16930ddbd/extract-content/0.log" Dec 11 06:39:53 crc 
kubenswrapper[4628]: I1211 06:39:53.633352 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v9ml8_1faf62cf-c5ee-426d-afb5-25a16930ddbd/extract-utilities/0.log" Dec 11 06:39:53 crc kubenswrapper[4628]: I1211 06:39:53.700118 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v9ml8_1faf62cf-c5ee-426d-afb5-25a16930ddbd/extract-content/0.log" Dec 11 06:39:54 crc kubenswrapper[4628]: I1211 06:39:54.298218 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v9ml8_1faf62cf-c5ee-426d-afb5-25a16930ddbd/registry-server/0.log" Dec 11 06:40:01 crc kubenswrapper[4628]: I1211 06:40:01.890545 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:40:01 crc kubenswrapper[4628]: E1211 06:40:01.892028 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:40:14 crc kubenswrapper[4628]: I1211 06:40:14.890473 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:40:14 crc kubenswrapper[4628]: E1211 06:40:14.891324 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:40:18 crc kubenswrapper[4628]: E1211 06:40:18.279729 4628 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.18:41950->38.102.83.18:36143: read tcp 38.102.83.18:41950->38.102.83.18:36143: read: connection reset by peer Dec 11 06:40:28 crc kubenswrapper[4628]: I1211 06:40:28.890207 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:40:28 crc kubenswrapper[4628]: E1211 06:40:28.890945 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:40:42 crc kubenswrapper[4628]: I1211 06:40:42.889090 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:40:42 crc kubenswrapper[4628]: E1211 06:40:42.889932 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:40:55 crc kubenswrapper[4628]: I1211 06:40:55.891057 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:40:55 crc kubenswrapper[4628]: E1211 06:40:55.892202 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:41:06 crc kubenswrapper[4628]: I1211 06:41:06.889310 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:41:06 crc kubenswrapper[4628]: E1211 06:41:06.890120 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:41:17 crc kubenswrapper[4628]: I1211 06:41:17.896065 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:41:17 crc kubenswrapper[4628]: E1211 06:41:17.896742 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:41:30 crc kubenswrapper[4628]: I1211 06:41:30.890233 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:41:30 crc kubenswrapper[4628]: E1211 06:41:30.891318 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:41:42 crc kubenswrapper[4628]: I1211 06:41:42.891810 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:41:42 crc kubenswrapper[4628]: E1211 06:41:42.893194 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:41:47 crc kubenswrapper[4628]: I1211 06:41:47.476098 4628 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q89z2"] Dec 11 06:41:47 crc kubenswrapper[4628]: E1211 06:41:47.478528 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db801331-d5d4-4d59-9ad5-2137f573d698" containerName="container-00" Dec 11 06:41:47 crc kubenswrapper[4628]: I1211 06:41:47.478554 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="db801331-d5d4-4d59-9ad5-2137f573d698" containerName="container-00" Dec 11 06:41:47 crc kubenswrapper[4628]: I1211 06:41:47.479142 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="db801331-d5d4-4d59-9ad5-2137f573d698" containerName="container-00" Dec 11 06:41:47 crc kubenswrapper[4628]: I1211 06:41:47.481323 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q89z2" Dec 11 06:41:47 crc kubenswrapper[4628]: I1211 06:41:47.500114 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q89z2"] Dec 11 06:41:47 crc kubenswrapper[4628]: I1211 06:41:47.508114 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbd48\" (UniqueName: \"kubernetes.io/projected/debb0104-d7f5-466d-a2d2-6efb5ccefee1-kube-api-access-vbd48\") pod \"redhat-marketplace-q89z2\" (UID: \"debb0104-d7f5-466d-a2d2-6efb5ccefee1\") " pod="openshift-marketplace/redhat-marketplace-q89z2" Dec 11 06:41:47 crc kubenswrapper[4628]: I1211 06:41:47.508180 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/debb0104-d7f5-466d-a2d2-6efb5ccefee1-utilities\") pod \"redhat-marketplace-q89z2\" (UID: \"debb0104-d7f5-466d-a2d2-6efb5ccefee1\") " pod="openshift-marketplace/redhat-marketplace-q89z2" Dec 11 06:41:47 crc kubenswrapper[4628]: I1211 06:41:47.508349 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/debb0104-d7f5-466d-a2d2-6efb5ccefee1-catalog-content\") pod \"redhat-marketplace-q89z2\" (UID: \"debb0104-d7f5-466d-a2d2-6efb5ccefee1\") " pod="openshift-marketplace/redhat-marketplace-q89z2" Dec 11 06:41:47 crc kubenswrapper[4628]: I1211 06:41:47.609887 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/debb0104-d7f5-466d-a2d2-6efb5ccefee1-catalog-content\") pod \"redhat-marketplace-q89z2\" (UID: \"debb0104-d7f5-466d-a2d2-6efb5ccefee1\") " pod="openshift-marketplace/redhat-marketplace-q89z2" Dec 11 06:41:47 crc kubenswrapper[4628]: I1211 06:41:47.610061 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbd48\" (UniqueName: \"kubernetes.io/projected/debb0104-d7f5-466d-a2d2-6efb5ccefee1-kube-api-access-vbd48\") pod \"redhat-marketplace-q89z2\" (UID: \"debb0104-d7f5-466d-a2d2-6efb5ccefee1\") " pod="openshift-marketplace/redhat-marketplace-q89z2" Dec 11 06:41:47 crc kubenswrapper[4628]: I1211 06:41:47.610087 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/debb0104-d7f5-466d-a2d2-6efb5ccefee1-utilities\") pod \"redhat-marketplace-q89z2\" (UID: \"debb0104-d7f5-466d-a2d2-6efb5ccefee1\") " pod="openshift-marketplace/redhat-marketplace-q89z2" Dec 11 06:41:47 crc kubenswrapper[4628]: I1211 06:41:47.610469 4628 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/debb0104-d7f5-466d-a2d2-6efb5ccefee1-catalog-content\") pod \"redhat-marketplace-q89z2\" (UID: \"debb0104-d7f5-466d-a2d2-6efb5ccefee1\") " pod="openshift-marketplace/redhat-marketplace-q89z2" Dec 11 06:41:47 crc kubenswrapper[4628]: I1211 06:41:47.610919 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/debb0104-d7f5-466d-a2d2-6efb5ccefee1-utilities\") pod \"redhat-marketplace-q89z2\" (UID: \"debb0104-d7f5-466d-a2d2-6efb5ccefee1\") " pod="openshift-marketplace/redhat-marketplace-q89z2" Dec 11 06:41:47 crc kubenswrapper[4628]: I1211 06:41:47.634830 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbd48\" (UniqueName: \"kubernetes.io/projected/debb0104-d7f5-466d-a2d2-6efb5ccefee1-kube-api-access-vbd48\") pod \"redhat-marketplace-q89z2\" (UID: \"debb0104-d7f5-466d-a2d2-6efb5ccefee1\") " pod="openshift-marketplace/redhat-marketplace-q89z2" Dec 11 06:41:47 crc kubenswrapper[4628]: I1211 06:41:47.813330 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q89z2" Dec 11 06:41:48 crc kubenswrapper[4628]: I1211 06:41:48.373437 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q89z2"] Dec 11 06:41:48 crc kubenswrapper[4628]: I1211 06:41:48.795557 4628 generic.go:334] "Generic (PLEG): container finished" podID="debb0104-d7f5-466d-a2d2-6efb5ccefee1" containerID="97068220ec33803a6e13364d4ae9d221da9606b6e43fb7bb83c1d96001c0595e" exitCode=0 Dec 11 06:41:48 crc kubenswrapper[4628]: I1211 06:41:48.795865 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q89z2" event={"ID":"debb0104-d7f5-466d-a2d2-6efb5ccefee1","Type":"ContainerDied","Data":"97068220ec33803a6e13364d4ae9d221da9606b6e43fb7bb83c1d96001c0595e"} Dec 11 06:41:48 crc kubenswrapper[4628]: I1211 06:41:48.795900 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q89z2" event={"ID":"debb0104-d7f5-466d-a2d2-6efb5ccefee1","Type":"ContainerStarted","Data":"bbfc8e20ac026025c2e55672e556c3eb527fe392ae20ba429f8c6d1d74df3370"} Dec 11 06:41:48 crc kubenswrapper[4628]: I1211 06:41:48.798267 4628 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 11 06:41:49 crc kubenswrapper[4628]: I1211 06:41:49.808950 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q89z2" event={"ID":"debb0104-d7f5-466d-a2d2-6efb5ccefee1","Type":"ContainerStarted","Data":"08dfe2034817efb97d07b1a2e2f99dbfdc8c777dfc90589d472677748290e5b7"} Dec 11 06:41:50 crc kubenswrapper[4628]: I1211 06:41:50.836100 4628 generic.go:334] "Generic (PLEG): container finished" podID="debb0104-d7f5-466d-a2d2-6efb5ccefee1" containerID="08dfe2034817efb97d07b1a2e2f99dbfdc8c777dfc90589d472677748290e5b7" exitCode=0 Dec 11 06:41:50 crc kubenswrapper[4628]: I1211 06:41:50.836314 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q89z2" event={"ID":"debb0104-d7f5-466d-a2d2-6efb5ccefee1","Type":"ContainerDied","Data":"08dfe2034817efb97d07b1a2e2f99dbfdc8c777dfc90589d472677748290e5b7"} Dec 11 06:41:51 crc kubenswrapper[4628]: I1211 06:41:51.849325 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-q89z2" event={"ID":"debb0104-d7f5-466d-a2d2-6efb5ccefee1","Type":"ContainerStarted","Data":"c554972ac94e1043bfe62e08d7aa51054d023238717a0b1f3984fec465650d0c"} Dec 11 06:41:51 crc kubenswrapper[4628]: I1211 06:41:51.872314 4628 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q89z2" podStartSLOduration=2.307827384 podStartE2EDuration="4.872298708s" podCreationTimestamp="2025-12-11 06:41:47 +0000 UTC" firstStartedPulling="2025-12-11 06:41:48.79768242 +0000 UTC m=+5211.215029128" lastFinishedPulling="2025-12-11 06:41:51.362153754 +0000 UTC m=+5213.779500452" observedRunningTime="2025-12-11 06:41:51.871509137 +0000 UTC m=+5214.288855875" watchObservedRunningTime="2025-12-11 06:41:51.872298708 +0000 UTC m=+5214.289645406" Dec 11 06:41:53 crc kubenswrapper[4628]: I1211 06:41:53.889209 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:41:53 crc kubenswrapper[4628]: E1211 06:41:53.889719 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:41:57 crc kubenswrapper[4628]: I1211 06:41:57.814511 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q89z2" Dec 11 06:41:57 crc kubenswrapper[4628]: I1211 06:41:57.815179 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q89z2" Dec 11 06:41:57 crc kubenswrapper[4628]: I1211 06:41:57.863507 4628 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q89z2" Dec 11 06:41:58 crc kubenswrapper[4628]: I1211 06:41:58.013925 4628 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q89z2" Dec 11 06:41:59 crc kubenswrapper[4628]: I1211 06:41:59.234290 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q89z2"] Dec 11 06:41:59 crc kubenswrapper[4628]: I1211 06:41:59.959938 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q89z2" podUID="debb0104-d7f5-466d-a2d2-6efb5ccefee1" containerName="registry-server" containerID="cri-o://c554972ac94e1043bfe62e08d7aa51054d023238717a0b1f3984fec465650d0c" gracePeriod=2 Dec 11 06:42:00 crc kubenswrapper[4628]: I1211 06:42:00.418861 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q89z2" Dec 11 06:42:00 crc kubenswrapper[4628]: I1211 06:42:00.569068 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbd48\" (UniqueName: \"kubernetes.io/projected/debb0104-d7f5-466d-a2d2-6efb5ccefee1-kube-api-access-vbd48\") pod \"debb0104-d7f5-466d-a2d2-6efb5ccefee1\" (UID: \"debb0104-d7f5-466d-a2d2-6efb5ccefee1\") " Dec 11 06:42:00 crc kubenswrapper[4628]: I1211 06:42:00.569173 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/debb0104-d7f5-466d-a2d2-6efb5ccefee1-catalog-content\") pod \"debb0104-d7f5-466d-a2d2-6efb5ccefee1\" (UID: \"debb0104-d7f5-466d-a2d2-6efb5ccefee1\") " Dec 11 06:42:00 crc kubenswrapper[4628]: I1211 06:42:00.569887 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/debb0104-d7f5-466d-a2d2-6efb5ccefee1-utilities\") pod \"debb0104-d7f5-466d-a2d2-6efb5ccefee1\" (UID: \"debb0104-d7f5-466d-a2d2-6efb5ccefee1\") " Dec 11 06:42:00 crc kubenswrapper[4628]: I1211 06:42:00.574254 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/debb0104-d7f5-466d-a2d2-6efb5ccefee1-utilities" (OuterVolumeSpecName: "utilities") pod "debb0104-d7f5-466d-a2d2-6efb5ccefee1" (UID: "debb0104-d7f5-466d-a2d2-6efb5ccefee1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:42:00 crc kubenswrapper[4628]: I1211 06:42:00.580606 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/debb0104-d7f5-466d-a2d2-6efb5ccefee1-kube-api-access-vbd48" (OuterVolumeSpecName: "kube-api-access-vbd48") pod "debb0104-d7f5-466d-a2d2-6efb5ccefee1" (UID: "debb0104-d7f5-466d-a2d2-6efb5ccefee1"). InnerVolumeSpecName "kube-api-access-vbd48". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:42:00 crc kubenswrapper[4628]: I1211 06:42:00.617668 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/debb0104-d7f5-466d-a2d2-6efb5ccefee1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "debb0104-d7f5-466d-a2d2-6efb5ccefee1" (UID: "debb0104-d7f5-466d-a2d2-6efb5ccefee1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:42:00 crc kubenswrapper[4628]: I1211 06:42:00.672957 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbd48\" (UniqueName: \"kubernetes.io/projected/debb0104-d7f5-466d-a2d2-6efb5ccefee1-kube-api-access-vbd48\") on node \"crc\" DevicePath \"\"" Dec 11 06:42:00 crc kubenswrapper[4628]: I1211 06:42:00.672992 4628 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/debb0104-d7f5-466d-a2d2-6efb5ccefee1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 11 06:42:00 crc kubenswrapper[4628]: I1211 06:42:00.673008 4628 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/debb0104-d7f5-466d-a2d2-6efb5ccefee1-utilities\") on node \"crc\" DevicePath \"\"" Dec 11 06:42:00 crc kubenswrapper[4628]: I1211 06:42:00.976431 4628 generic.go:334] "Generic (PLEG): container finished" podID="debb0104-d7f5-466d-a2d2-6efb5ccefee1" containerID="c554972ac94e1043bfe62e08d7aa51054d023238717a0b1f3984fec465650d0c" exitCode=0 Dec 11 06:42:00 crc kubenswrapper[4628]: I1211 06:42:00.976491 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q89z2" event={"ID":"debb0104-d7f5-466d-a2d2-6efb5ccefee1","Type":"ContainerDied","Data":"c554972ac94e1043bfe62e08d7aa51054d023238717a0b1f3984fec465650d0c"} Dec 11 06:42:00 crc kubenswrapper[4628]: I1211 06:42:00.976514 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q89z2" Dec 11 06:42:00 crc kubenswrapper[4628]: I1211 06:42:00.976542 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q89z2" event={"ID":"debb0104-d7f5-466d-a2d2-6efb5ccefee1","Type":"ContainerDied","Data":"bbfc8e20ac026025c2e55672e556c3eb527fe392ae20ba429f8c6d1d74df3370"} Dec 11 06:42:00 crc kubenswrapper[4628]: I1211 06:42:00.976572 4628 scope.go:117] "RemoveContainer" containerID="c554972ac94e1043bfe62e08d7aa51054d023238717a0b1f3984fec465650d0c" Dec 11 06:42:01 crc kubenswrapper[4628]: I1211 06:42:01.017450 4628 scope.go:117] "RemoveContainer" containerID="08dfe2034817efb97d07b1a2e2f99dbfdc8c777dfc90589d472677748290e5b7" Dec 11 06:42:01 crc kubenswrapper[4628]: I1211 06:42:01.025065 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q89z2"] Dec 11 06:42:01 crc kubenswrapper[4628]: I1211 06:42:01.039213 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q89z2"] Dec 11 06:42:01 crc kubenswrapper[4628]: I1211 06:42:01.050969 4628 scope.go:117] "RemoveContainer" containerID="97068220ec33803a6e13364d4ae9d221da9606b6e43fb7bb83c1d96001c0595e" Dec 11 06:42:01 crc kubenswrapper[4628]: I1211 06:42:01.091866 4628 scope.go:117] "RemoveContainer" containerID="c554972ac94e1043bfe62e08d7aa51054d023238717a0b1f3984fec465650d0c" Dec 11 06:42:01 crc kubenswrapper[4628]: E1211 06:42:01.094169 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c554972ac94e1043bfe62e08d7aa51054d023238717a0b1f3984fec465650d0c\": container with ID starting with c554972ac94e1043bfe62e08d7aa51054d023238717a0b1f3984fec465650d0c not found: ID does not exist" containerID="c554972ac94e1043bfe62e08d7aa51054d023238717a0b1f3984fec465650d0c" Dec 11 06:42:01 crc kubenswrapper[4628]: I1211 06:42:01.094217 4628 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c554972ac94e1043bfe62e08d7aa51054d023238717a0b1f3984fec465650d0c"} err="failed to get container status \"c554972ac94e1043bfe62e08d7aa51054d023238717a0b1f3984fec465650d0c\": rpc error: code = NotFound desc = could not find container \"c554972ac94e1043bfe62e08d7aa51054d023238717a0b1f3984fec465650d0c\": container with ID starting with c554972ac94e1043bfe62e08d7aa51054d023238717a0b1f3984fec465650d0c not found: ID does not exist" Dec 11 06:42:01 crc kubenswrapper[4628]: I1211 06:42:01.094240 4628 scope.go:117] "RemoveContainer" containerID="08dfe2034817efb97d07b1a2e2f99dbfdc8c777dfc90589d472677748290e5b7" Dec 11 06:42:01 crc kubenswrapper[4628]: E1211 06:42:01.094526 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08dfe2034817efb97d07b1a2e2f99dbfdc8c777dfc90589d472677748290e5b7\": container with ID starting with 08dfe2034817efb97d07b1a2e2f99dbfdc8c777dfc90589d472677748290e5b7 not found: ID does not exist" containerID="08dfe2034817efb97d07b1a2e2f99dbfdc8c777dfc90589d472677748290e5b7" Dec 11 06:42:01 crc kubenswrapper[4628]: I1211 06:42:01.094556 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08dfe2034817efb97d07b1a2e2f99dbfdc8c777dfc90589d472677748290e5b7"} err="failed to get container status \"08dfe2034817efb97d07b1a2e2f99dbfdc8c777dfc90589d472677748290e5b7\": rpc error: code = NotFound desc = could not find container \"08dfe2034817efb97d07b1a2e2f99dbfdc8c777dfc90589d472677748290e5b7\": container with ID starting with 08dfe2034817efb97d07b1a2e2f99dbfdc8c777dfc90589d472677748290e5b7 not found: ID does not exist" Dec 11 06:42:01 crc kubenswrapper[4628]: I1211 06:42:01.094573 4628 scope.go:117] "RemoveContainer" containerID="97068220ec33803a6e13364d4ae9d221da9606b6e43fb7bb83c1d96001c0595e" Dec 11 06:42:01 crc kubenswrapper[4628]: E1211 06:42:01.094816 4628 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97068220ec33803a6e13364d4ae9d221da9606b6e43fb7bb83c1d96001c0595e\": container with ID starting with 97068220ec33803a6e13364d4ae9d221da9606b6e43fb7bb83c1d96001c0595e not found: ID does not exist" containerID="97068220ec33803a6e13364d4ae9d221da9606b6e43fb7bb83c1d96001c0595e" Dec 11 06:42:01 crc kubenswrapper[4628]: I1211 06:42:01.094871 4628 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97068220ec33803a6e13364d4ae9d221da9606b6e43fb7bb83c1d96001c0595e"} err="failed to get container status \"97068220ec33803a6e13364d4ae9d221da9606b6e43fb7bb83c1d96001c0595e\": rpc error: code = NotFound desc = could not find container \"97068220ec33803a6e13364d4ae9d221da9606b6e43fb7bb83c1d96001c0595e\": container with ID starting with 97068220ec33803a6e13364d4ae9d221da9606b6e43fb7bb83c1d96001c0595e not found: ID does not exist" Dec 11 06:42:01 crc kubenswrapper[4628]: I1211 06:42:01.902810 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="debb0104-d7f5-466d-a2d2-6efb5ccefee1" path="/var/lib/kubelet/pods/debb0104-d7f5-466d-a2d2-6efb5ccefee1/volumes" Dec 11 06:42:05 crc kubenswrapper[4628]: I1211 06:42:05.893597 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:42:05 crc kubenswrapper[4628]: E1211 06:42:05.894216 4628 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:42:15 crc kubenswrapper[4628]: I1211 06:42:15.128073 4628 generic.go:334] "Generic (PLEG): container finished" podID="7e0799c7-7ac6-46c5-8478-4edc7de737d2" containerID="501554b130ab89d0b7ee4cc093ca6400532a7b899a4fc71fb3f1c92cf39c0e0b" exitCode=0 Dec 11 06:42:15 crc kubenswrapper[4628]: I1211 06:42:15.128311 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwvpk/must-gather-jfb8l" event={"ID":"7e0799c7-7ac6-46c5-8478-4edc7de737d2","Type":"ContainerDied","Data":"501554b130ab89d0b7ee4cc093ca6400532a7b899a4fc71fb3f1c92cf39c0e0b"} Dec 11 06:42:15 crc kubenswrapper[4628]: I1211 06:42:15.129344 4628 scope.go:117] "RemoveContainer" containerID="501554b130ab89d0b7ee4cc093ca6400532a7b899a4fc71fb3f1c92cf39c0e0b" Dec 11 06:42:15 crc kubenswrapper[4628]: I1211 06:42:15.621445 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lwvpk_must-gather-jfb8l_7e0799c7-7ac6-46c5-8478-4edc7de737d2/gather/0.log" Dec 11 06:42:18 crc kubenswrapper[4628]: I1211 06:42:18.890465 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:42:18 crc kubenswrapper[4628]: E1211 06:42:18.891485 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:42:29 crc kubenswrapper[4628]: I1211 06:42:29.455997 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lwvpk/must-gather-jfb8l"] Dec 11 06:42:29 crc kubenswrapper[4628]: I1211 06:42:29.456873 4628 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-lwvpk/must-gather-jfb8l" podUID="7e0799c7-7ac6-46c5-8478-4edc7de737d2" containerName="copy" containerID="cri-o://2fcb3b907ca57da1b4593792d38b6eeb6d2b08e31804336667a54ef870b9f8ef" gracePeriod=2 Dec 11 06:42:29 crc kubenswrapper[4628]: I1211 06:42:29.487968 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lwvpk/must-gather-jfb8l"] Dec 11 06:42:29 crc kubenswrapper[4628]: I1211 06:42:29.893133 4628 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lwvpk_must-gather-jfb8l_7e0799c7-7ac6-46c5-8478-4edc7de737d2/copy/0.log" Dec 11 06:42:29 crc kubenswrapper[4628]: I1211 06:42:29.895208 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lwvpk/must-gather-jfb8l" Dec 11 06:42:29 crc kubenswrapper[4628]: I1211 06:42:29.930085 4628 scope.go:117] "RemoveContainer" containerID="501554b130ab89d0b7ee4cc093ca6400532a7b899a4fc71fb3f1c92cf39c0e0b" Dec 11 06:42:29 crc kubenswrapper[4628]: I1211 06:42:29.969959 4628 scope.go:117] "RemoveContainer" containerID="2fcb3b907ca57da1b4593792d38b6eeb6d2b08e31804336667a54ef870b9f8ef" Dec 11 06:42:29 crc kubenswrapper[4628]: I1211 06:42:29.993771 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7e0799c7-7ac6-46c5-8478-4edc7de737d2-must-gather-output\") pod \"7e0799c7-7ac6-46c5-8478-4edc7de737d2\" (UID: \"7e0799c7-7ac6-46c5-8478-4edc7de737d2\") " Dec 11 06:42:29 crc kubenswrapper[4628]: I1211 06:42:29.993965 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tbbj\" (UniqueName: \"kubernetes.io/projected/7e0799c7-7ac6-46c5-8478-4edc7de737d2-kube-api-access-8tbbj\") pod \"7e0799c7-7ac6-46c5-8478-4edc7de737d2\" (UID: \"7e0799c7-7ac6-46c5-8478-4edc7de737d2\") " Dec 11 06:42:30 crc kubenswrapper[4628]: I1211 06:42:30.002174 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e0799c7-7ac6-46c5-8478-4edc7de737d2-kube-api-access-8tbbj" (OuterVolumeSpecName: "kube-api-access-8tbbj") pod "7e0799c7-7ac6-46c5-8478-4edc7de737d2" (UID: "7e0799c7-7ac6-46c5-8478-4edc7de737d2"). InnerVolumeSpecName "kube-api-access-8tbbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:42:30 crc kubenswrapper[4628]: I1211 06:42:30.098332 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tbbj\" (UniqueName: \"kubernetes.io/projected/7e0799c7-7ac6-46c5-8478-4edc7de737d2-kube-api-access-8tbbj\") on node \"crc\" DevicePath \"\"" Dec 11 06:42:30 crc kubenswrapper[4628]: I1211 06:42:30.249256 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e0799c7-7ac6-46c5-8478-4edc7de737d2-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7e0799c7-7ac6-46c5-8478-4edc7de737d2" (UID: "7e0799c7-7ac6-46c5-8478-4edc7de737d2"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 11 06:42:30 crc kubenswrapper[4628]: I1211 06:42:30.260066 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lwvpk/must-gather-jfb8l" Dec 11 06:42:30 crc kubenswrapper[4628]: I1211 06:42:30.302170 4628 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7e0799c7-7ac6-46c5-8478-4edc7de737d2-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 11 06:42:31 crc kubenswrapper[4628]: I1211 06:42:31.899343 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e0799c7-7ac6-46c5-8478-4edc7de737d2" path="/var/lib/kubelet/pods/7e0799c7-7ac6-46c5-8478-4edc7de737d2/volumes" Dec 11 06:42:32 crc kubenswrapper[4628]: I1211 06:42:32.891198 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:42:32 crc kubenswrapper[4628]: E1211 06:42:32.891790 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:42:45 crc kubenswrapper[4628]: I1211 06:42:45.889442 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:42:45 crc kubenswrapper[4628]: E1211 06:42:45.890718 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:43:00 crc kubenswrapper[4628]: I1211 06:43:00.890410 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:43:00 crc kubenswrapper[4628]: E1211 06:43:00.890997 4628 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hvwvx_openshift-machine-config-operator(2cbe69b9-c210-427d-9807-bf7cf7a70e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" Dec 11 06:43:15 crc kubenswrapper[4628]: I1211 06:43:15.889382 4628 scope.go:117] "RemoveContainer" containerID="5b05c31f87f89c43b78fc19ffe8ab6c03a2e73afd940ed8d9219cd7950bf0f16" Dec 11 06:43:16 crc kubenswrapper[4628]: I1211 06:43:16.678515 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" event={"ID":"2cbe69b9-c210-427d-9807-bf7cf7a70e3a","Type":"ContainerStarted","Data":"5c3e42600581fdc65cfd2ed6b590deeb6ee3a4cb6a517a8bfb26f47f49897999"} Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.156864 4628 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b"] Dec 11 06:45:00 crc kubenswrapper[4628]: E1211 06:45:00.158026 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debb0104-d7f5-466d-a2d2-6efb5ccefee1" containerName="extract-content" Dec 11 06:45:00 crc 
kubenswrapper[4628]: I1211 06:45:00.158040 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="debb0104-d7f5-466d-a2d2-6efb5ccefee1" containerName="extract-content" Dec 11 06:45:00 crc kubenswrapper[4628]: E1211 06:45:00.158052 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debb0104-d7f5-466d-a2d2-6efb5ccefee1" containerName="extract-utilities" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.158060 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="debb0104-d7f5-466d-a2d2-6efb5ccefee1" containerName="extract-utilities" Dec 11 06:45:00 crc kubenswrapper[4628]: E1211 06:45:00.158082 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debb0104-d7f5-466d-a2d2-6efb5ccefee1" containerName="registry-server" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.158089 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="debb0104-d7f5-466d-a2d2-6efb5ccefee1" containerName="registry-server" Dec 11 06:45:00 crc kubenswrapper[4628]: E1211 06:45:00.158116 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0799c7-7ac6-46c5-8478-4edc7de737d2" containerName="copy" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.158123 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0799c7-7ac6-46c5-8478-4edc7de737d2" containerName="copy" Dec 11 06:45:00 crc kubenswrapper[4628]: E1211 06:45:00.158138 4628 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0799c7-7ac6-46c5-8478-4edc7de737d2" containerName="gather" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.158145 4628 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0799c7-7ac6-46c5-8478-4edc7de737d2" containerName="gather" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.158405 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0799c7-7ac6-46c5-8478-4edc7de737d2" containerName="copy" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.158424 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0799c7-7ac6-46c5-8478-4edc7de737d2" containerName="gather" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.158437 4628 memory_manager.go:354] "RemoveStaleState removing state" podUID="debb0104-d7f5-466d-a2d2-6efb5ccefee1" containerName="registry-server" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.159316 4628 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.164414 4628 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.165086 4628 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.169679 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b"] Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.258147 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acb71d09-6b53-4304-a3eb-8e3584bb29aa-config-volume\") pod \"collect-profiles-29423925-8p42b\" (UID: \"acb71d09-6b53-4304-a3eb-8e3584bb29aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.258531 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fz4x\" (UniqueName: \"kubernetes.io/projected/acb71d09-6b53-4304-a3eb-8e3584bb29aa-kube-api-access-4fz4x\") pod \"collect-profiles-29423925-8p42b\" (UID: \"acb71d09-6b53-4304-a3eb-8e3584bb29aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.259019 4628 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/acb71d09-6b53-4304-a3eb-8e3584bb29aa-secret-volume\") pod \"collect-profiles-29423925-8p42b\" (UID: \"acb71d09-6b53-4304-a3eb-8e3584bb29aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.361002 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/acb71d09-6b53-4304-a3eb-8e3584bb29aa-secret-volume\") pod \"collect-profiles-29423925-8p42b\" (UID: \"acb71d09-6b53-4304-a3eb-8e3584bb29aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.361111 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acb71d09-6b53-4304-a3eb-8e3584bb29aa-config-volume\") pod \"collect-profiles-29423925-8p42b\" (UID: \"acb71d09-6b53-4304-a3eb-8e3584bb29aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.361215 4628 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fz4x\" (UniqueName: \"kubernetes.io/projected/acb71d09-6b53-4304-a3eb-8e3584bb29aa-kube-api-access-4fz4x\") pod \"collect-profiles-29423925-8p42b\" (UID: \"acb71d09-6b53-4304-a3eb-8e3584bb29aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.362028 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acb71d09-6b53-4304-a3eb-8e3584bb29aa-config-volume\") pod 
\"collect-profiles-29423925-8p42b\" (UID: \"acb71d09-6b53-4304-a3eb-8e3584bb29aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.367526 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/acb71d09-6b53-4304-a3eb-8e3584bb29aa-secret-volume\") pod \"collect-profiles-29423925-8p42b\" (UID: \"acb71d09-6b53-4304-a3eb-8e3584bb29aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.380030 4628 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fz4x\" (UniqueName: \"kubernetes.io/projected/acb71d09-6b53-4304-a3eb-8e3584bb29aa-kube-api-access-4fz4x\") pod \"collect-profiles-29423925-8p42b\" (UID: \"acb71d09-6b53-4304-a3eb-8e3584bb29aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.489468 4628 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b" Dec 11 06:45:00 crc kubenswrapper[4628]: I1211 06:45:00.986463 4628 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b"] Dec 11 06:45:01 crc kubenswrapper[4628]: I1211 06:45:01.604541 4628 generic.go:334] "Generic (PLEG): container finished" podID="acb71d09-6b53-4304-a3eb-8e3584bb29aa" containerID="45b81e018c604de8c83279f77a7fd8c67248a110757b24d0bd8ef7b1c5b6a704" exitCode=0 Dec 11 06:45:01 crc kubenswrapper[4628]: I1211 06:45:01.604611 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b" event={"ID":"acb71d09-6b53-4304-a3eb-8e3584bb29aa","Type":"ContainerDied","Data":"45b81e018c604de8c83279f77a7fd8c67248a110757b24d0bd8ef7b1c5b6a704"} Dec 11 06:45:01 crc kubenswrapper[4628]: I1211 06:45:01.605882 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b" event={"ID":"acb71d09-6b53-4304-a3eb-8e3584bb29aa","Type":"ContainerStarted","Data":"23792781e03d0bc6c8f4e0aa0b948bb0daa4e005eafa5a2841aaf7e349089633"} Dec 11 06:45:02 crc kubenswrapper[4628]: I1211 06:45:02.958482 4628 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b" Dec 11 06:45:03 crc kubenswrapper[4628]: I1211 06:45:03.130340 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/acb71d09-6b53-4304-a3eb-8e3584bb29aa-secret-volume\") pod \"acb71d09-6b53-4304-a3eb-8e3584bb29aa\" (UID: \"acb71d09-6b53-4304-a3eb-8e3584bb29aa\") " Dec 11 06:45:03 crc kubenswrapper[4628]: I1211 06:45:03.130570 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acb71d09-6b53-4304-a3eb-8e3584bb29aa-config-volume\") pod \"acb71d09-6b53-4304-a3eb-8e3584bb29aa\" (UID: \"acb71d09-6b53-4304-a3eb-8e3584bb29aa\") " Dec 11 06:45:03 crc kubenswrapper[4628]: I1211 06:45:03.130662 4628 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fz4x\" (UniqueName: \"kubernetes.io/projected/acb71d09-6b53-4304-a3eb-8e3584bb29aa-kube-api-access-4fz4x\") pod \"acb71d09-6b53-4304-a3eb-8e3584bb29aa\" (UID: \"acb71d09-6b53-4304-a3eb-8e3584bb29aa\") " Dec 11 06:45:03 crc kubenswrapper[4628]: I1211 06:45:03.131463 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acb71d09-6b53-4304-a3eb-8e3584bb29aa-config-volume" (OuterVolumeSpecName: "config-volume") pod "acb71d09-6b53-4304-a3eb-8e3584bb29aa" (UID: "acb71d09-6b53-4304-a3eb-8e3584bb29aa"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 11 06:45:03 crc kubenswrapper[4628]: I1211 06:45:03.144778 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb71d09-6b53-4304-a3eb-8e3584bb29aa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "acb71d09-6b53-4304-a3eb-8e3584bb29aa" (UID: "acb71d09-6b53-4304-a3eb-8e3584bb29aa"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 11 06:45:03 crc kubenswrapper[4628]: I1211 06:45:03.144884 4628 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb71d09-6b53-4304-a3eb-8e3584bb29aa-kube-api-access-4fz4x" (OuterVolumeSpecName: "kube-api-access-4fz4x") pod "acb71d09-6b53-4304-a3eb-8e3584bb29aa" (UID: "acb71d09-6b53-4304-a3eb-8e3584bb29aa"). InnerVolumeSpecName "kube-api-access-4fz4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 11 06:45:03 crc kubenswrapper[4628]: I1211 06:45:03.233400 4628 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/acb71d09-6b53-4304-a3eb-8e3584bb29aa-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 11 06:45:03 crc kubenswrapper[4628]: I1211 06:45:03.233436 4628 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acb71d09-6b53-4304-a3eb-8e3584bb29aa-config-volume\") on node \"crc\" DevicePath \"\"" Dec 11 06:45:03 crc kubenswrapper[4628]: I1211 06:45:03.233446 4628 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fz4x\" (UniqueName: \"kubernetes.io/projected/acb71d09-6b53-4304-a3eb-8e3584bb29aa-kube-api-access-4fz4x\") on node \"crc\" DevicePath \"\"" Dec 11 06:45:03 crc kubenswrapper[4628]: I1211 06:45:03.623532 4628 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b" event={"ID":"acb71d09-6b53-4304-a3eb-8e3584bb29aa","Type":"ContainerDied","Data":"23792781e03d0bc6c8f4e0aa0b948bb0daa4e005eafa5a2841aaf7e349089633"} Dec 11 06:45:03 crc kubenswrapper[4628]: I1211 06:45:03.623575 4628 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23792781e03d0bc6c8f4e0aa0b948bb0daa4e005eafa5a2841aaf7e349089633" Dec 11 06:45:03 crc kubenswrapper[4628]: I1211 06:45:03.623678 4628 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29423925-8p42b" Dec 11 06:45:04 crc kubenswrapper[4628]: I1211 06:45:04.085957 4628 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4"] Dec 11 06:45:04 crc kubenswrapper[4628]: I1211 06:45:04.095981 4628 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29423880-6t2x4"] Dec 11 06:45:05 crc kubenswrapper[4628]: I1211 06:45:05.902186 4628 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5892c32-a678-4f96-aaa9-03f39b6fe036" path="/var/lib/kubelet/pods/c5892c32-a678-4f96-aaa9-03f39b6fe036/volumes" Dec 11 06:45:30 crc kubenswrapper[4628]: I1211 06:45:30.160189 4628 scope.go:117] "RemoveContainer" containerID="5fab9aa013bea59abbe59841a20d9131aa481616c58edc44e60748ae722774c6" Dec 11 06:45:31 crc kubenswrapper[4628]: I1211 06:45:31.427420 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 11 06:45:31 crc kubenswrapper[4628]: I1211 06:45:31.427731 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 11 06:46:01 crc kubenswrapper[4628]: I1211 06:46:01.427273 4628 patch_prober.go:28] interesting pod/machine-config-daemon-hvwvx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 11 06:46:01 crc kubenswrapper[4628]: I1211 06:46:01.428203 4628 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hvwvx" podUID="2cbe69b9-c210-427d-9807-bf7cf7a70e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
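
The last two entries above record the kubelet liveness prober failing to reach the machine-config-daemon health endpoint at http://127.0.0.1:8798/health with "connection refused", consistent with that container's earlier CrashLoopBackOff back-offs in this log. For illustration only, a minimal Go sketch of an equivalent HTTP health check follows; the 1-second timeout is an assumption, and this is not the kubelet's actual prober implementation:

// Minimal sketch of an HTTP liveness-style check against the endpoint
// reported in the probe failure above. Illustrative only; the timeout
// value is an assumption, not taken from the kubelet configuration.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{Timeout: 1 * time.Second}
	// URL matches the probe target in the log: "http://127.0.0.1:8798/health".
	resp, err := client.Get("http://127.0.0.1:8798/health")
	if err != nil {
		// A "connection refused" error here corresponds to the failures
		// logged at 06:45:31 and 06:46:01 while the container was down.
		fmt.Printf("probe failed: %v\n", err)
		return
	}
	defer resp.Body.Close()
	fmt.Printf("probe succeeded: %s\n", resp.Status)
}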